
flare's People

Contributors

austin-taylor, e-urban, jonathanburkert, justinhendersonsmapper, pberba, redsand, smapper, tbennett6421, vchan-in, vpiserchia

flare's Issues

Connecting to elasticsearch without auth

Hi,

I get the error "'elasticBeacon' object has no attribute 'auth'". I have seen that fixes have been made regarding connecting to ES without auth credentials, so I don't understand why it is now a problem.

Thank you
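
A minimal, self-contained illustration of why that message can appear (a simplified stand-in, not flare's actual class): an attribute that is only assigned on the authenticated code path raises AttributeError as soon as the no-auth path touches it.

    # Simplified stand-in for the reported behaviour; not flare's real code.
    class elasticBeacon:
        def __init__(self, username=None, password=None):
            if username and password:
                self.auth = (username, password)   # only ever set when credentials are supplied

    # The no-auth path then fails the moment anything reads the attribute:
    elasticBeacon().auth   # raises AttributeError: 'elasticBeacon' object has no attribute 'auth'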

flare_beacon not respecting suricata_defaults in config

$ flare_beacon -gr -who -fo -html beacon.html -c configs/elasticsearch.ini
[INFO] Attempting to connect to elasticsearch...
[SUCCESS] Connected to elasticsearch on localhost:9201
[INFO] Gathering flow data... this may take a while...
[DEBUG] {'query': {'bool': {'filter': {'bool': {'must_not': [], 'must': [{'range': {'timestamp': {'gte': 1497481356845, 'lte': 1497567756845, 'format': 'epoch_millis'}}}]}}, 'must': {'query_string': {'query': '*', 'analyze_wildcard': 'true'}}}}, '_source': ['saddr', 'daddr', 'dport', 'timet', 'flow.sbyte', '_id']}

$ grep suricata configs/elasticsearch.ini
#set to false if using suricata defaults if you have custom fields
suricata_defaults = False

Also, I tried setting it to True just in case the logic was reversed. In my local instance, I changed
self.suricata_defaults = self.config.get('beacon','suricata_defaults')
to
self.suricata_defaults = False
and it performed as expected.

I've been unable to figure out this one.
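
One likely explanation (an assumption; the thread doesn't confirm it): ConfigParser's get() returns the string 'False', and any non-empty string is truthy in Python, so the suricata-defaults branch is taken no matter what the option says, which is consistent with the hardcoded False behaving correctly. A minimal sketch of the corresponding fix:

    # Sketch: parse the option as a boolean rather than a string.
    # config.get('beacon', 'suricata_defaults')        -> 'False' (a truthy string)
    # config.getboolean('beacon', 'suricata_defaults') -> False
    self.suricata_defaults = self.config.getboolean('beacon', 'suricata_defaults')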

Python elasticsearch module version 8.x doesn't support RequestsHttpConnection class

Currently there is no version specified in requirements.txt for the elasticsearch module. By default this installs the latest version, which currently does not work with the project, as Transport classes are deprecated starting from version 8.0.

When trying to run the project, the following message prints: https://github.com/austin-taylor/flare/blob/master/flare/analytics/command_control.py#L16.

Possible solutions:

  • Modify the requirements.txt and specify version 7.17.2, which is the last version that supports custom Transport classes (see the requirements.txt sketch below).
  • Use the latest version and get rid of RequestsHttpConnection as, correct me if I'm wrong, the script doesn't use any features from the requests library. If request features are used, the new RequestsHttpNode can be used instead.
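
A minimal way to apply the first option, as the report suggests (the pin comes from the text above; adjust if a newer 7.x release exists):

    # requirements.txt
    elasticsearch==7.17.2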

Error running dga_c.predict('facebook') in Python3

Running this on my setup results in the following error:

  prediction = dga_c.predict('facebook')
 File "/usr/local/lib/python3.5/dist-packages/Flare-0.3-py3.5.egg/flare/data_science/features.py", line 248, in predict
   return self.clf.predict(_X)[0]
 File "/usr/local/lib/python3.5/dist-packages/sklearn/ensemble/forest.py", line 538, in predict
   proba = self.predict_proba(X)
 File "/usr/local/lib/python3.5/dist-packages/sklearn/ensemble/forest.py", line 578, in predict_proba
   X = self._validate_X_predict(X)
 File "/usr/local/lib/python3.5/dist-packages/sklearn/ensemble/forest.py", line 357, in _validate_X_predict
   return self.estimators_[0]._validate_X_predict(X, check_input=True)
 File "/usr/local/lib/python3.5/dist-packages/sklearn/tree/tree.py", line 373, in _validate_X_predict
   X = check_array(X, dtype=DTYPE, accept_sparse="csr")
 File "/usr/local/lib/python3.5/dist-packages/sklearn/utils/validation.py", line 441, in check_array
   "if it contains a single sample.".format(array))
ValueError: Expected 2D array, got 1D array instead:
array=[  8.           2.75        44.73284149  27.87242699].
Reshape your data either using array.reshape(-1, 1) if your data has a single feature or array.reshape(1, -1) if it contains a single sample.

I am running the latest version of flare.

Thanks in advance,
Daan
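
The error text itself points at the fix: scikit-learn's predict() expects a 2-D array of shape (n_samples, n_features), and here the feature vector arrives as a 1-D array. A minimal sketch of the reshape for a single sample (the variable name _X comes from the traceback; the surrounding method is assumed, not quoted):

    import numpy as np

    _X = np.array([8.0, 2.75, 44.73284149, 27.87242699])   # one sample, four features (1-D)
    _X = _X.reshape(1, -1)                                  # shape (1, 4): a single sample in 2-D
    # return self.clf.predict(_X)[0]    # as in features.py line 248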

Installation

Hey Austin and Others,

Is there an installation script or installation instructions for running on CentOS7?

Thanks

Problems with custom fields & Elastic 7.3.2

Hi,
Great tool, but I experience some errors during execution. Could it be possible that it has something to do with Elastic 7.3.2?
Here is my elasticsearch.ini
[beacon]
es_host=172.16.32.5
es_index=suricata-*
es_port=9200
es_timeout=480
min_occur=10
min_interval=10
min_percent=2
window=30
threads=2
period=2
kibana_version=5
verbose=True
debug=True

#Elasticsearch fields for beaconing
field_source_ip=src_ip
field_destination_ip=dst_ip
field_destination_port=dst_port
field_timestamp=@timestamp
field_flow_bytes_toserver=flow.bytes_toserver
field_flow_id=flow_id
#If using bro, may need to replace with conn
event_type=suricata

#Authentication
username=''
password=''

#set to false if using suricata defaults if you have custom fields
suricata_defaults = false
I turned on debugging and this is what I got:

flare_beacon --verbose --group --whois --focus_outbound -c /home/user/flare/configs/elasticsearch.ini -json beacon.json

[INFO] Attempting to connect to elasticsearch...
[SUCCESS] Connected to elasticsearch on 172.16.32.5:9200
[INFO] Gathering flow data... this may take a while...
[DEBUG] {'query': {'bool': {'filter': [{'bool': {'must_not': [], 'must': [{'range': {'@timestamp': {'gte': 1569262862007, 'lte': 1569270062007, 'format': 'epoch_millis'}}}]}}, {'term': {'event.type': 'suricata'}}], 'must': {'query_string': {'query': '', 'analyze_wildcard': 'true'}}}}, '_source': ['src_ip', 'dst_ip', 'dst_port', '@timestamp', 'flow.bytes_toserver', 'flow_id']}
[DEBUG] {'query': {'bool': {'filter': [{'bool': {'must_not': [], 'must': [{'range': {'@timestamp': {'gte': 1569262862007, 'lte': 1569270062007, 'format': 'epoch_millis'}}}]}}, {'term': {'event.type': 'suricata'}}], 'must': {'query_string': {'query': '', 'analyze_wildcard': 'true'}}}}, '_source': ['src_ip', 'dst_ip', 'dst_port', '@timestamp', 'flow.bytes_toserver', 'flow_id']}
[DEBUG] @timestamp dst_ip dst_port flow flow_id src_ip
0 2019-09-23T18:22:47.707Z NaN NaN NaN NaN NaN
1 2019-09-23T18:22:27.671Z NaN NaN NaN NaN NaN
2 2019-09-23T18:23:32.295Z NaN NaN NaN NaN NaN
3 2019-09-23T18:22:07.592Z NaN NaN NaN NaN NaN
4 2019-09-23T18:23:02.180Z NaN NaN NaN NaN NaN
5 2019-09-23T18:21:21.954Z NaN NaN NaN NaN NaN
6 2019-09-23T18:28:38.549Z NaN NaN NaN NaN NaN
7 2019-09-23T18:28:33.035Z NaN NaN NaN NaN NaN
8 2019-09-23T18:28:08.512Z NaN NaN NaN NaN NaN
9 2019-09-23T18:28:13.021Z NaN NaN NaN NaN NaN
10 2019-09-23T18:30:18.890Z NaN NaN NaN NaN NaN
11 2019-09-23T18:30:08.885Z NaN NaN NaN NaN NaN
12 2019-09-23T18:29:43.262Z NaN NaN NaN NaN NaN
13 2019-09-23T18:27:48.439Z NaN NaN NaN NaN NaN
14 2019-09-23T18:29:53.272Z NaN NaN NaN NaN NaN
15 2019-09-23T18:29:23.190Z NaN NaN NaN NaN NaN
16 2019-09-23T18:29:03.105Z NaN NaN NaN NaN NaN
17 2019-09-23T18:37:50.331Z NaN NaN NaN NaN NaN
18 2019-09-23T18:37:54.840Z NaN NaN NaN NaN NaN
19 2019-09-23T18:37:00.201Z NaN NaN NaN NaN NaN
20 2019-09-23T18:35:30.029Z NaN NaN NaN NaN NaN
21 2019-09-23T18:36:24.585Z NaN NaN NaN NaN NaN
22 2019-09-23T18:34:54.367Z NaN NaN NaN NaN NaN
23 2019-09-23T18:36:40.138Z NaN NaN NaN NaN NaN
24 2019-09-23T18:34:39.942Z NaN NaN NaN NaN NaN
25 2019-09-23T18:36:14.575Z NaN NaN NaN NaN NaN
26 2019-09-23T18:37:14.725Z NaN NaN NaN NaN NaN
27 2019-09-23T18:43:01.260Z NaN NaN NaN NaN NaN
28 2019-09-23T18:43:05.915Z NaN NaN NaN NaN NaN
29 2019-09-23T18:42:41.240Z NaN NaN NaN NaN NaN
... ... ... ... ... ... ...
15767 2019-09-23T20:20:39.734Z 192.168.12.65 52346.0 NaN 1.471021e+15 192.168.12.1
15768 2019-09-23T20:19:23.909Z 192.168.12.1 53.0 NaN 1.504029e+15 192.168.12.20
15769 2019-09-23T20:20:40.477Z 192.168.12.1 53.0 NaN 1.517610e+14 192.168.12.20
15770 2019-09-23T20:19:26.224Z 192.168.12.1 53.0 NaN 1.824670e+15 192.168.12.20
15771 2019-09-23T20:20:40.489Z 192.168.12.20 51121.0 NaN 1.517610e+14 192.168.12.1
15772 2019-09-23T20:19:22.384Z 192.168.12.20 42863.0 NaN 5.343826e+14 192.168.12.1
15773 2019-09-23T20:20:42.023Z 192.168.11.1 53.0 {u'bytes_toserver': 83} 1.947489e+15 192.168.11.30
15774 2019-09-23T20:20:42.023Z 192.168.11.1 53.0 {u'bytes_toserver': 89} 2.248017e+15 192.168.11.30
15775 2019-09-23T20:19:26.237Z 192.168.12.20 46984.0 NaN 1.824670e+15 192.168.12.1
15776 2019-09-23T20:20:41.090Z 172.16.32.2 968.0 NaN 1.116387e+15 192.168.11.10
15777 2019-09-23T20:19:56.191Z 172.16.32.6 1015.0 NaN 1.859919e+15 192.168.11.10
15778 2019-09-23T20:20:48.013Z 192.168.11.1 53.0 {u'bytes_toserver': 167} 2.018723e+15 192.168.11.62
15779 2019-09-23T20:19:59.024Z 192.168.12.20 51729.0 NaN 1.504029e+15 192.168.12.1
15780 2019-09-23T20:20:45.886Z 192.168.12.20 42408.0 NaN 2.010799e+15 192.168.12.1
15781 2019-09-23T20:20:42.022Z 192.168.11.1 53.0 {u'bytes_toserver': 91} 7.555262e+14 192.168.11.30
15782 2019-09-23T20:20:51.080Z 192.168.11.1 53.0 {u'bytes_toserver': 74} 1.253790e+15 192.168.11.62
15783 2019-09-23T20:20:46.022Z 184.50.162.217 443.0 {u'bytes_toserver': 3958} 1.482827e+15 192.168.11.62
15784 2019-09-23T20:20:49.801Z 192.168.12.65 52364.0 NaN 1.964770e+15 192.168.12.1
15785 2019-09-23T20:20:47.014Z 139.162.192.13 123.0 {u'bytes_toserver': 90} 1.375954e+15 192.168.11.71
15786 2019-09-23T20:20:48.013Z 192.168.11.1 53.0 {u'bytes_toserver': 144} 7.315431e+14 192.168.11.62
15787 2019-09-23T20:20:48.013Z 192.168.11.1 53.0 {u'bytes_toserver': 97} 7.970951e+14 192.168.11.62
15788 2019-09-23T20:20:44.758Z 192.168.12.65 52353.0 NaN 1.565523e+15 192.168.12.1
15789 2019-09-23T20:20:45.885Z 192.168.12.1 53.0 NaN 2.010799e+15 192.168.12.20
15790 2019-09-23T20:20:46.145Z 192.168.12.1 53.0 NaN 7.998153e+13 192.168.12.20
15791 2019-09-23T20:20:46.523Z 192.168.12.1 53.0 NaN 5.601224e+14 192.168.12.65
15792 2019-09-23T20:20:45.886Z 192.168.12.20 46984.0 NaN 1.824670e+15 192.168.12.1
15793 2019-09-23T20:20:48.226Z 192.168.12.20 51121.0 NaN 1.517610e+14 192.168.12.1
15794 2019-09-23T20:20:51.080Z 192.168.11.255 138.0 {u'bytes_toserver': 243} 1.183719e+15 192.168.11.10
15795 2019-09-23T20:20:49.786Z 77.74.177.233 80.0 NaN 2.027704e+14 192.168.12.65
15796 2019-09-23T20:20:51.014Z 255.255.255.255 67.0 {u'bytes_toserver': 676} 2.410321e+14 0.0.0.0

[15797 rows x 6 columns]
ERROR: 'float' object has no attribute 'get'

Any help is welcome

Regards
Thierry
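
A guess based on the debug frame above: many rows have NaN in the flow column (documents matched by event.type: suricata that carry no flow.bytes_toserver), and calling .get() on a float NaN raises exactly "'float' object has no attribute 'get'". A defensive extraction, for illustration only (not flare's actual code):

    import pandas as pd

    # Two rows shaped like the debug output above: one with a flow dict, one without.
    df = pd.DataFrame({'flow': [{'bytes_toserver': 83}, float('nan')]})
    df['bytes_toserver'] = df['flow'].apply(
        lambda v: v.get('bytes_toserver', 0) if isinstance(v, dict) else 0)
    print(df)

Restricting the query so that only records carrying the flow object are returned (or dropping the NaN rows before grouping) may avoid the error as well.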

Error when running against Packetbeat-*

This is the message I am getting when running at the command line; any help would be appreciated:

$ flare_beacon --group --whois --focus_outbound -c configs/elasticsearch.ini -html beacons.html
[INFO] Attempting to connect to elasticsearch...
[SUCCESS] Connected to elasticsearch on 10.10.18.83:9200
[INFO] Gathering flow data... this may take a while...
ERROR: Elasticsearch did not retrieve any data. Please ensure your settings are correct inside the config file.

This is my selks4.ini file:

[beacon]
es_host=x.x.x.x
es_index=packetbeat-*
es_port=9200
es_timeout=480
min_occur=50
min_interval=30
min_percent=30
window=2
threads=8
period=24
kibana_version=5
verbose=true

#Elasticsearch fields for beaconing
field_source_ip=client_ip
field_destination_ip=dns.answers.data
field_destination_port=port
field_timestamp=@timestamp
field_flow_bytes_toserver=bytes_out
field_flow_id=_id

#set to false if you have custom fields
suricata_defaults = false

Flare missing @ in timestamp

Queries in the latest version are not working due to the missing @ in the timestamp field in the range section. Submitting a PR for this later.

Meaning of the parameter WINDOW?

When I adjust the parameter WINDOW, it causes inaccurate beacon detection.
I tried setting WINDOW to 2 and to 5, and the results differ after running.

  1. How do I understand the parameter WINDOW?
  2. Can you tell me a little bit about what this code does? I don't really understand it. (See the worked example after the code below.)
    def percent_grouping(self, d, total):
        mx = 0
        interval = 0
        # Finding the key with the largest value (interval with most events)
        mx_key = int(max(iter(list(d.keys())), key=(lambda key: d[key])))

        mx_percent = 0.0

        # Slide a window of width self.WINDOW over the intervals around mx_key
        for i in range(mx_key - self.WINDOW, mx_key + 1):
            current = 0
            # Finding center of current window
            curr_interval = i + int(self.WINDOW / 2)
            # Count the events whose interval falls inside [i, i + WINDOW)
            for j in range(i, i + self.WINDOW):
                if j in d:
                    current += d[j]
            # Share of all events captured by this window placement
            percent = float(current) / total * 100

            # Keep the placement that captures the most events
            if percent > mx_percent:
                mx_percent = percent
                interval = curr_interval

        return interval, mx_percent
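
Reading the code above (an interpretation, not official documentation): d maps an inter-connection interval, in seconds, to how many connections arrived with that interval, and WINDOW is the width of a sliding window moved around the most common interval. The method returns the window centre that captures the largest share of the connections, and that share as a percentage; a larger WINDOW tolerates more jitter and so groups looser traffic as "beaconing", which is why results change between 2 and 5. A standalone restatement with a worked example (hypothetical, for illustration only):

    # Standalone copy of the logic above, with WINDOW passed in as a parameter.
    def percent_grouping(d, total, window):
        mx_key = max(d, key=d.get)                 # interval with the most events
        best_interval, best_percent = 0, 0.0
        for i in range(mx_key - window, mx_key + 1):
            count = sum(d.get(j, 0) for j in range(i, i + window))
            percent = count / total * 100
            if percent > best_percent:
                best_percent = percent
                best_interval = i + window // 2    # centre of this window placement
        return best_interval, best_percent

    # 10 flows whose inter-arrival intervals cluster around 11 seconds:
    d = {10: 3, 11: 5, 12: 2}
    print(percent_grouping(d, total=10, window=2))   # -> (11, 80.0): ~80% of flows beacon at ~11 s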

Connection successful but no data retrieved - Suggestions for troubleshooting?

When launching Flare it appears to successfully connect to Elasticsearch but doesn't return any results; the error about not retrieving any data is returned instantly, which makes it seem like it's not actually searching (there are gigabytes of data in the indices).

I don't, however, have Bro/Snort/Suricata in this setup; rather, I'm using NetFlow data which is normalised to appropriately typed fields in a custom ES index (SIP, DIP, Dport, @timestamp).

At this stage, I can't work out how to troubleshoot this further, so any tips welcomed, please.

I'm running the beacon example as per http://www.austintaylor.io/detect/beaconing/intrusion/detection/system/command/control/flare/elastic/stack/2017/06/10/detect-beaconing-with-flare-elasticsearch-and-intrusion-detection-systems/ and can successfully retrieve this flow data via Kibana in ES.

flare_beacon -c elasticsearch.ini
[INFO] Attempting to connect to elasticsearch...
[SUCCESS] Connected to elasticsearch on localhost:9200
[INFO] Gathering flow data... this may take a while...
ERROR: Elasticsearch did not retrieve any data. Please ensure your settings are correct inside the config file.

elasticsearch.ini (I have played around with the min values and timeouts, to no avail)

[beacon]
es_host=localhost
es_index=custom-*
es_port=9200
es_timeout=200000
min_occur=5
min_interval=30
min_percent=1
window=15
threads=2
period=48
kibana_version=5
verbose=true

#Elasticsearch fields for beaconing
field_source_ip=SIP
field_destination_ip=DIP
field_destination_port=Dport
field_timestamp=@timestamp
field_flow_bytes_toserver=BytesOut
field_flow_id=CollectionSequence

#Authentication
username=''
password=''

#set to false if using suricata defaults if you have custom fields
suricata_defaults = false

Running ES "5.5.0" on CentOS7.

python -V
Python 2.7.9

pip list
argparse (1.2.1)
backport-ipaddress (0.1)
cffi (0.8.6)
chardet (2.3.0)
colorama (0.3.2)
configparser (3.5.0)
cryptography (0.6.1)
elasticsearch (5.4.0)
Flare (0.3)
gyp (0.1)
html5lib (0.999)
idna (2.6)
ipaddr (2.1.11)
lxml (3.4.0)
MarkupSafe (0.23)
ndg-httpsclient (0.3.2)
numpy (1.13.1)
pandas (0.20.3)
pip (1.5.6)
ply (3.4)
pyasn (1.5.0b7)
pyasn1 (0.1.7)
pycparser (2.10)
pyOpenSSL (0.14)
python-dateutil (2.6.1)
python-geoip (1.2)
python-geoip-geolite2 (2015.0303)
pytz (2017.2)
requests (2.4.3)
requests-file (1.4.2)
scikit-learn (0.19.0)
scipy (0.19.1)
setuptools (5.5.1)
six (1.8.0)
sklearn (0.0)
tldextract (2.0.1)
urllib3 (1.9.1)
wheel (0.24.0)
wsgiref (0.1.2)

Exemplary toolkit here btw :)

Cheers,

Chris
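
One way to narrow this down, independent of flare's config parsing, is to run the same kind of scrolled query directly with the elasticsearch Python client (5.4.0 per the pip list above) and confirm the index pattern, time field, and _source fields actually return documents. A rough sketch under those assumptions (host, index, and field names copied from the config above):

    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch([{'host': 'localhost', 'port': 9200}])
    query = {
        'query': {'range': {'@timestamp': {'gte': 'now-48h'}}},
        '_source': ['SIP', 'DIP', 'Dport', '@timestamp', 'BytesOut'],
    }
    hits = list(helpers.scan(es, query=query, index='custom-*', size=10))
    print(len(hits))
    print(hits[0]['_source'] if hits else 'no documents matched')

If this also returns nothing, the problem is the index pattern or the time field; if it returns documents, the flare config (for example the period/window settings or field names) is the more likely culprit.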

config file not being read?

It appears the config file is not being read, even though I'm passing in a parameter for it at the command line.

$ flare_beacon -c configs/elasticsearch.ini
[SUCCESS] Connected to elasticsearch on localhost:9200

$ grep port configs/elasticsearch.ini
es_port=9201

shouldn't the first one connect on 9201?

command_control.py SyntaxError

Hello,

I pulled the latest version; during "python setup.py install" I noted an error, and when I tried to execute flare I got the same error:
"
Traceback (most recent call last):
  File "/usr/local/bin/flare_beacon", line 4, in <module>
    __import__('pkg_resources').run_script('Flare==0.4', 'flare_beacon')
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 666, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 1446, in run_script
    exec(code, namespace, namespace)
  File "/usr/local/lib/python3.7/dist-packages/Flare-0.4-py3.7.egg/EGG-INFO/scripts/flare_beacon", line 8, in <module>
    from flare.analytics.command_control import elasticBeacon
  File "/usr/local/lib/python3.7/dist-packages/Flare-0.4-py3.7.egg/flare/analytics/command_control.py", line 308
    self.dprint(df)
                  ^
SyntaxError: invalid syntax
"
My fix was to pull command_control.py from your previous branch and replace it, and that resolved my issue; no error anymore.
I did not dig into the code, but perhaps something is wrong with the latest version of command_control.py, or I missed something?

Thank you.

Bug/Enhancement: Elasticsearch authentication in class

It appears that when using flare, if you don't provide a config file, self.auth_user and self.auth_password do not get set anywhere. Following that, there is a try block that attempts to instantiate Elasticsearch() by passing http_auth=(self.auth_user, self.auth_pass).

I think that, should you not have a config, this will never connect to ES, as auth_user and auth_password would be None.

In our instance, we have an elasticsearch instance that is listening on localhost without creds. Modifying command_control.py:145 to remove http_auth(...) did allow connection to ES without credentials.

Suggestions:

  • allow auth_user and auth_password to be called in the elasticBeacon() constructor
  • Provide a sanity check on auth credentials before calling Elasticsearch()

I'd be willing to assist and make a PR
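
A rough sketch of the first suggestion (parameter and helper names here are hypothetical, not flare's), passing http_auth only when credentials are actually present:

    from elasticsearch import Elasticsearch, RequestsHttpConnection

    def build_es_client(host, port, auth_user=None, auth_password=None):
        """Build an Elasticsearch client, adding http_auth only when creds are set."""
        kwargs = {
            'hosts': [{'host': host, 'port': port}],
            'connection_class': RequestsHttpConnection,
        }
        if auth_user and auth_password:
            kwargs['http_auth'] = (auth_user, auth_password)
        return Elasticsearch(**kwargs)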

pandas installed, but flare not seeing it

When I run flare_beacon, I get

$ flare_beacon -h
Please make sure you have pandas installed. pip -r requirements.txt or pip install pandas

but I'm pretty sure it's installed

$ pip show pandas
Name: pandas
Version: 0.20.2
Summary: Powerful data structures for data analysis, time series,and statistics
Home-page: http://pandas.pydata.org
Author: The PyData Development Team
Author-email: [email protected]
License: BSD
Location: /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages
Requires: numpy, python-dateutil, pytz

and the package is at the directory specified. Any ideas on why this might not be working?
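
That message is presumably printed when flare's import of pandas fails, which in practice often means flare_beacon is running under a different Python interpreter than the one pip installed pandas into (pip here points at the Python 2.7 framework install). A quick, generic check:

    import sys
    print(sys.executable)        # which interpreter is actually running
    import pandas
    print(pandas.__version__)    # fails with ImportError if pandas isn't on this interpreter's path

Running that with the same interpreter that owns the flare_beacon script should show whether the two environments match.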

Missing function to update ASN data

Hey,

I am trying to use Flare for some testing.
I wanted to try to create a little cronjob that updates the ASN list every day, but it seems that the feed function for the ASNHTMLParser is missing. Is it possible that this hasn't been written yet?
Or am I missing a file that provides this function?

(Code fails on this line:

parser.feed(data)
)

Thank you in advance!

Reason: No option 'filter' in section: 'beacon' Error

Hi, I have recently installed Flare on my Security Onion box, and I want to check for beacons against the flow data from my Elasticsearch index. ES version = 5

I set the config.ini file as on the GitHub page, and matched it against Austin's Medium blog too.

However, I see the error that there is no 'filter' option in the 'beacon' section. I am sure the filter option is missing from my config file. Please advise what generic filter I could set there.

Thanks
Prashant
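
For what it's worth, the debug output in other reports in this thread shows the generated query_string defaulting to '*', so a catch-all value is one guess for a generic filter. The exact semantics of the option aren't documented here, so treat this purely as a hypothesis:

    # hypothetical addition under [beacon]; '*' mirrors the catch-all query seen elsewhere
    filter=*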

Fields in ES7 / Packetbeat

I've been trying to set up Flare with ES 7.6.2, running it against Packetbeat flow data to check some things.
I've had to tweak some of the hardcoded values, such as event_type to type, and had to hardcode my ES host in RFC-1738 format to get past a mix of SSL issues.

I have been able to connect to the cluster, but I'm hitting the same issue every time, and I'm struggling to find any logs that indicate what exactly the problem is; I'm hoping someone could give a slight steer.

[INFO] Attempting to connect to elasticsearch...
[SUCCESS] Connected to elasticsearch on 192.168.1.5:9200
[INFO] Gathering flow data... this may take a while...
[DEBUG] {'query': {'bool': {'must': {'query_string': {'query': '_exists_:source.ip AND _exists_:destination.port AND _exists_:destination.ip', 'analyze_wildcard': 'true'}}, 'filter': [{'bool': {'must': [{'range': {'@timestamp': {'gte': 1586289336488, 'lte': 1586375736488, 'format': 'epoch_millis'}}}], 'must_not': []}}, {'term': {'type': 'flow'}}]}}, '_source': ['source.ip', 'destination.ip', 'destination.port', '@timestamp', 'destination.bytes', 'network.protocol']}
[DEBUG] {'query': {'bool': {'must': {'query_string': {'query': '_exists_:source.ip AND _exists_:destination.port AND _exists_:destination.ip', 'analyze_wildcard': 'true'}}, 'filter': [{'bool': {'must': [{'range': {'@timestamp': {'gte': 1586289336488, 'lte': 1586375736488, 'format': 'epoch_millis'}}}], 'must_not': []}}, {'term': {'type': 'flow'}}]}}, '_source': ['source.ip', 'destination.ip', 'destination.port', '@timestamp', 'destination.bytes', 'network.protocol']}
[DEBUG]                        @timestamp                                        destination                   source
0        2020-04-08T08:35:00.000Z  {'port': 49909, 'bytes': 339816, 'ip': '192.16...  {'ip': '192.168.1.102'}
1        2020-04-08T08:35:00.000Z  {'port': 9200, 'bytes': 880821, 'ip': '192.168...  {'ip': '192.168.1.160'}
2        2020-04-08T08:35:00.000Z  {'port': 9200, 'bytes': 62888, 'ip': '192.168....  {'ip': '192.168.1.160'}
3        2020-04-08T08:35:00.000Z  {'port': 27021, 'bytes': 103974, 'ip': '162.25...  {'ip': '192.168.1.160'}
4        2020-04-08T08:35:00.000Z  {'port': 9200, 'bytes': 44056, 'ip': '192.168....  {'ip': '192.168.1.160'}
...                           ...                                                ...                      ...
1286164  2020-04-08T19:55:30.000Z               {'port': 123, 'ip': '185.83.169.27'}  {'ip': '192.168.1.102'}
1286165  2020-04-08T19:55:30.000Z             {'port': 15600, 'ip': '192.168.1.255'}   {'ip': '192.168.1.69'}
1286166  2020-04-08T19:55:30.000Z             {'port': 15600, 'ip': '192.168.1.255'}   {'ip': '192.168.1.87'}
1286167  2020-04-08T19:55:30.000Z           {'port': 56700, 'ip': '255.255.255.255'}  {'ip': '192.168.1.168'}
1286168  2020-04-08T19:55:30.000Z              {'port': 2054, 'ip': '192.168.1.102'}  {'ip': '192.168.1.160'}

[1286169 rows x 3 columns]
ERROR: 'destination.port'

I have tried pointing the destination port setting at other port fields, and it fails with the same error at the same point.
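
The debug frame above shows that destination and source are still nested dicts inside the DataFrame, so a lookup by the dotted column name 'destination.port' finds nothing, which matches the error. A sketch of flattening the ECS-style documents with pandas (assumes a reasonably recent pandas; the record shape is copied from the output above, values invented):

    import pandas as pd

    records = [
        {'@timestamp': '2020-04-08T08:35:00.000Z',
         'destination': {'port': 49909, 'bytes': 339816, 'ip': '192.168.1.1'},
         'source': {'ip': '192.168.1.102'}},
    ]
    df = pd.json_normalize(records)
    print(sorted(df.columns))
    # includes 'destination.port', 'destination.ip', 'source.ip', among others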

Config option 'use_ssl' should be retrieved as boolean

I believe this config option below should be retrieved using getboolean() instead:

self.use_ssl = self.config.get('beacon', 'use_ssl')

When I set use_ssl in my config file like use_ssl=False, I then get the error:

ERROR: ConnectionError(HTTPSConnectionPool(host='hostname.local', port=9200): Max retries exceeded with url: /filebeat-*/_search?scroll=90m&size=1000&timeout=10m (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'ssl3_get_record', 'wrong version number')],)",),))) caused by: SSLError(HTTPSConnectionPool(host='hostname.local', port=9200): Max retries exceeded with url: /filebeat-*/_search?scroll=90m&size=1000&timeout=10m (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'ssl3_get_record', 'wrong version number')],)",),)))

Flare is using the Elasticsearch constructor, and this creates an elasticsearch.Connection (parent) from elasticsearch.RequestsHttpConnection (this would also apply to the default elasticsearch.Urllib3HttpConnection, even though Flare doesn't use that, as far as I could find).

If you look at elasticsearch.connection.base.py#L49 you can see that conditional would behave correctly with a boolean but would get set to True with any valid string (even 'False').
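
A minimal demonstration of the difference (runnable as-is; not flare's code):

    from configparser import ConfigParser

    config = ConfigParser()
    config.read_string('[beacon]\nuse_ssl = False\n')

    print(bool(config.get('beacon', 'use_ssl')))       # True  -- the string 'False' is truthy
    print(config.getboolean('beacon', 'use_ssl'))      # False -- parsed as an actual boolean
    # i.e. self.use_ssl = self.config.getboolean('beacon', 'use_ssl')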

HTTPS support with auth and latest Elasticsearch version support?

Hi,

As I really wanted to make it work, I started to dig and found out that HTTPS support could be implemented relatively easily, as documented here.

"SSL client authentication" and "RFC-1738 formatted URL" seam to work after I bypassed self.auth = None in the code.. this need to be adjust...

After adjusting the values in the config file to match Flow v9 from my Elasticsearch DB, in run_query() the query
resp = helpers.scan(query=query, client=self.es, scroll="90m", index=self.es_index, timeout="10m")
returns nothing, so df is empty in:
df = pd.DataFrame([rec['_source'] for rec in resp])

And I am stuck here: no data is retrieved.

[INFO] Attempting to connect to MY elasticsearch...
[SUCCESS] Connected to elasticsearch on IP:9200
[INFO] Gathering flow data... this may take a while...
{'query': {'bool': {'filter': [{'bool': {'must_not': [], 'must': [{'range': {'@timestamp': {'gte': 1565237742354, 'lte': 1565324142354, 'format': 'epoch_millis'}}}]}}, {'term': {'event_type': 'flow'}}], 'must': {'query_string': {'query': '*', 'analyze_wildcard': 'true'}}}}, '_source': ['src_addr', 'dst_addr', 'dst_port', '@timestamp', 'bytes', 'flowset_id']}
Empty DataFrame
Columns: []
Index: []
ERROR: Elasticsearch did not retrieve any data. Please ensure your settings are correct inside the config file.

On the server side I can see a GET with a 200, meaning auth is OK, but no data is pulled. It is followed by a DELETE almost at the same time, which is weird.

I am wondering if you could make these changes work? I am not a coder, but I feel that it is feasible.

It is very frustrating, as a simple curl command like this works like a charm and retrieves the data; you will recognize the same request from the Flare code:

curl --insecure -X GET -u user:password "https://IP:9200/logdb-*/_search?scroll=90m&timeout=10m&size=1000"

Thank you!
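
For reference, the DELETE seen right after the GET is most likely the client clearing its scroll context when the scan finishes, which happens even when no documents matched. A rough sketch (elasticsearch-py 7.x-style; the URL, index, and field names follow the config values quoted above, and verify_certs=False only mirrors the --insecure curl test) of connecting over HTTPS with basic auth via an RFC-1738 URL and running the same scrolled query:

    from elasticsearch import Elasticsearch, RequestsHttpConnection, helpers

    es = Elasticsearch(
        ['https://user:password@IP:9200'],
        connection_class=RequestsHttpConnection,
        use_ssl=True,
        verify_certs=False,
    )

    query = {'query': {'query_string': {'query': '*', 'analyze_wildcard': True}},
             '_source': ['src_addr', 'dst_addr', 'dst_port', '@timestamp', 'bytes']}
    for hit in helpers.scan(es, query=query, index='logdb-*', scroll='90m'):
        print(hit['_source'])
        break

If this prints documents but flare's run_query() does not, the remaining difference is most likely the time-range filter or the event_type term in flare's generated query (the debug output above filters on event_type: flow).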
