
bloodhound-import's People

Contributors

arvchristos, daddycocoaman, dirkjanm, jarilaos, jazofra, jeffmcjunkin, mwgielen, paradoxis, vruello, wh1tenoise


bloodhound-import's Issues

bloodhound-import fails when importing computers on Ubuntu 20

I have the following configuration:

- Ubuntu 20
- Python 3

I have installed:

- pip3
- the neo4j driver
- the latest BloodHound (3.0.5)
- the latest SharpHound

When running `bloodhound-import -du neo4j -dp "password" /my/path/*.json`, I get the following error:

```
[INFO] 2020-10-06 18:10:19,775 - Parsing 90 files
[INFO] 2020-10-06 18:10:19,775 - Parsing bloodhound file: /opt/import/20201005T0000019512/20201005000002_computers.json
Traceback (most recent call last):
  File "/usr/local/bin/bloodhound-import", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/dist-packages/bloodhound_import/__init__.py", line 44, in main
    parse_file(filename, driver)
  File "/usr/local/lib/python3.8/dist-packages/bloodhound_import/importer.py", line 437, in parse_file
    session.write_transaction(parse_function, entry)
  File "/home/user/.local/lib/python3.8/site-packages/neo4j/work/simple.py", line 403, in write_transaction
    return self._run_transaction(WRITE_ACCESS, transaction_function, *args, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/neo4j/work/simple.py", line 309, in _run_transaction
    result = transaction_function(tx, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/bloodhound_import/importer.py", line 230, in parse_computer
    queries.append(Query(query, dict(source=entry['MemberId'], target=identifier)))
KeyError: 'MemberId'
```
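The `KeyError` means the JSON entry simply has no `MemberId` key, which happens when a newer SharpHound output schema renames or omits fields. A minimal defensive lookup is sketched below; the alternative key names are illustrative guesses, not the project's actual schema handling:

```python
def get_member_id(entry):
    """Return the member identifier from a SharpHound entry, tolerating
    schema differences between collector versions. The candidate key
    names here are illustrative, not exhaustive."""
    for key in ("MemberId", "ObjectIdentifier", "MemberName"):
        if key in entry:
            return entry[key]
    return None  # caller can skip entries with no recognisable id
```

Returning `None` instead of raising lets the importer skip an unrecognised entry rather than abort the whole file.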

bloodhound-import fails on Ubuntu 16.04

Running bloodhound-import fails with the following:

File "/usr/local/bin/bloodhound-import", line 9, in <module>
  load_entry_point('bloodhound-import==0.0.7', 'console_scripts', 'bloodhound-import')()
File "/usr/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 542, in load_entry_point
  return get_distribution(dist).load_entry_point(group, name)
File "/usr/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2569, in load_entry_point
  return ep.load()
File "/usr/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2229, in load
  return self.resolve()
File "/usr/lib/python2.7/dist-packages/pkg_resources/__init__.py", line 2235, in resolve
  module = __import__(self.module_name, fromlist=['__name__'], level=0)
File "build/bdist.linux-x86_64/egg/bloodhound_import/__init__.py", line 4, in <module>
File "/usr/local/lib/python2.7/dist-packages/bloodhound_import-0.0.7-py2.7.egg/bloodhound_import/importer.py", line 14
  query: str

(The `query: str` variable annotation is Python 3 syntax; the package is being run under Python 2.7, which cannot parse it, so the import fails with a syntax error.)

An existing connection was forcibly closed by the remote host

When uploading large JSON files (for example 500k groups), I sometimes get an "An existing connection was forcibly closed by the remote host" error. Please note that the database is running on localhost, so this is not due to connection problems.

Traceback (most recent call last):
  File "c:\python27\lib\runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "c:\python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\Scripts\bloodhound-import.exe\__main__.py", line 9, in <module>
  File "c:\python27\lib\site-packages\bloodhound_import\__init__.py", line 40, in main
    parse_file(filename, driver)
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 408, in parse_file
    session.write_transaction(parse_function, entry)
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 363, in __exit__
    self.close()
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 396, in close
    self.rollback_transaction()
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 652, in rollback_transaction
    self._disconnect(sync=True)
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 380, in _disconnect
    self._connection.sync()
  File "c:\python27\lib\site-packages\neobolt\direct.py", line 505, in sync
    self.send()
  File "c:\python27\lib\site-packages\neobolt\direct.py", line 394, in send
    self._send()
  File "c:\python27\lib\site-packages\neobolt\direct.py", line 409, in _send
    self.socket.sendall(data)
  File "c:\python27\lib\ssl.py", line 759, in sendall
    v = self.send(data[count:])
  File "c:\python27\lib\ssl.py", line 725, in send
    v = self._sslobj.write(data)
socket.error: [Errno 10054] An existing connection was forcibly closed by the remote host

Is it possible to add more robustness to the import?
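One way to add that robustness is to retry the transaction after a transient connection failure. A minimal sketch, assuming the caller wraps each `write_transaction` call; `OSError` stands in here for the driver-specific connection-error types, which vary by neo4j driver version:

```python
import time

def run_with_retry(fn, *args, attempts=3, delay=1.0, **kwargs):
    """Call fn(*args, **kwargs), retrying after transient connection
    errors with a linearly growing back-off. OSError is a stand-in for
    the installed driver's transient exception types."""
    for attempt in range(1, attempts + 1):
        try:
            return fn(*args, **kwargs)
        except OSError:
            if attempt == attempts:
                raise  # out of attempts: surface the real error
            time.sleep(delay * attempt)
```

Combined with smaller transaction batches, this keeps one dropped socket from aborting a multi-hour import.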

bloodhound-import fails when importing new SharpHound output

SharpHound in loop mode creates multiple computers and users files, and the script fails when importing the new files:

Traceback (most recent call last):
  File "/home/import/.local/bin/bloodhound-import", line 8, in <module>
    sys.exit(main())
  File "/home/import/.local/lib/python3.8/site-packages/bloodhound_import/__init__.py", line 44, in main
    parse_file(filename, driver)
  File "/home/import/.local/lib/python3.8/site-packages/bloodhound_import/importer.py", line 437, in parse_file
    session.write_transaction(parse_function, entry)
  File "/usr/local/lib/python3.8/dist-packages/neo4j/work/simple.py", line 403, in write_transaction
    return self._run_transaction(WRITE_ACCESS, transaction_function, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/neo4j/work/simple.py", line 309, in _run_transaction
    result = transaction_function(tx, *args, **kwargs)
  File "/home/import/.local/lib/python3.8/site-packages/bloodhound_import/importer.py", line 298, in parse_group
    tx.run(entry.query, props=entry.properties)
  File "/usr/local/lib/python3.8/dist-packages/neo4j/work/transaction.py", line 118, in run
    result._tx_ready_run(query, parameters, **kwparameters)
  File "/usr/local/lib/python3.8/dist-packages/neo4j/work/result.py", line 57, in _tx_ready_run
    self._run(query, parameters, None, None, None, **kwparameters)
  File "/usr/local/lib/python3.8/dist-packages/neo4j/work/result.py", line 101, in _run
    self._attach()
  File "/usr/local/lib/python3.8/dist-packages/neo4j/work/result.py", line 202, in _attach
    self._connection.fetch_message()
  File "/usr/local/lib/python3.8/dist-packages/neo4j/io/_bolt4.py", line 363, in fetch_message
    response.on_failure(summary_metadata or {})
  File "/usr/local/lib/python3.8/dist-packages/neo4j/io/_common.py", line 179, in on_failure
    raise Neo4jError.hydrate(**metadata)
neo4j.exceptions.ConstraintError: {code: Neo.ClientError.Schema.ConstraintValidationFailed} {message: Node(17478) already exists with label Group and property name = 'policy name@domain'}
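The `ConstraintValidationFailed` error means a later loop file tries to `CREATE` a group node that an earlier file already inserted under a uniqueness constraint. The usual Cypher fix is `MERGE`, which matches-or-creates and is idempotent across repeated imports. The queries below are illustrative, not the project's actual statements:

```python
# Illustrative Cypher. CREATE raises ConstraintValidationFailed when a
# :Group with the same name already exists; MERGE matches the existing
# node and only updates its properties, so re-importing is safe.
create_query = "CREATE (g:Group {name: $name}) SET g += $props"  # fails on re-import
merge_query = "MERGE (g:Group {name: $name}) SET g += $props"    # idempotent
```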

neo4j.exceptions.ClientError

Hi, I am trying to import the JSON file, but I am getting an exception:

Traceback (most recent call last):
  File "/usr/local/bin/bloodhound-import", line 11, in <module>
  load_entry_point('bloodhound-import==0.0.4', 'console_scripts', 'bloodhound-import')()
  File "/usr/local/lib/python3.8/dist-packages/bloodhound_import-0.0.4-py3.8.egg/bloodhound_import/__init__.py", line 36, in main
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/work/simple.py", line 474, in write_transaction
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/work/simple.py", line 408, in _run_transaction
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/work/simple.py", line 580, in _close
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/work/simple.py", line 553, in sync
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/work/simple.py", line 267, in sync
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/io/_bolt3.py", line 379, in fetch_all
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/io/_bolt3.py", line 325, in fetch_message
  File "/usr/local/lib/python3.8/dist-packages/neo4j_driver-4.0.0a4-py3.8.egg/neo4j/io/_bolt3.py", line 511, in on_failure
  neo4j.exceptions.ClientError: There already exists an index :User(name). A constraint cannot be created until the index has been dropped.
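The error message itself names the fix: the plain index on `:User(name)` must be dropped before the uniqueness constraint can be created. A sketch of the two statements, in neo4j 3.x Cypher syntax (4.x changed the schema DDL); the driver/session setup is omitted:

```python
# neo4j 3.x Cypher: drop the conflicting single-property index first,
# then create the uniqueness constraint on the same label/property pair.
statements = [
    "DROP INDEX ON :User(name)",
    "CREATE CONSTRAINT ON (u:User) ASSERT u.name IS UNIQUE",
]
# with driver.session() as session:   # driver setup omitted
#     for stmt in statements:
#         session.run(stmt)
```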

Bug - Parsing function for object type: azure was not found

Hi everyone,

I am currently working with BloodHound and bloodhound-import to import an Azure AD dump into BloodHound. Unfortunately, I get the following error message:

Parsing function for object type: azure was not found

Do I have to add an entry for azure to the parsing_map, or did I miss something? Could you assist on that matter?

Thank you in advance
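The error suggests the importer dispatches on the object type declared in each file's meta block and has no entry for `azure`. A purely hypothetical sketch of what extending such a dispatch map could look like; `parse_azure`, the lambda stand-ins, and the key names are illustrative, not the project's actual code:

```python
def parse_azure(tx, entry):
    """Hypothetical placeholder: a real parser would map AzureHound
    objects to neo4j nodes and relationships."""
    raise NotImplementedError("azure objects are not supported yet")

# Illustrative dispatch map keyed on the object type; a missing key is
# what produces "Parsing function for object type: azure was not found".
parsing_map = {
    "users": lambda tx, entry: None,      # stand-ins for the real parsers
    "computers": lambda tx, entry: None,
    "azure": parse_azure,                 # the entry the error says is missing
}
```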

Data import taking long time

It would be helpful if you could tweak the script by adding a few extra lines of code to define constraints and indexes for a faster data import.

I would also request that you make the script compatible with the latest BloodHound version.
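A hedged sketch of what those extra lines could look like: uniqueness constraints created up front give `MERGE` an index to look nodes up by, instead of scanning every node with the label on each write. The syntax is neo4j 3.x, and the label/property names mirror BloodHound's schema but are assumptions here:

```python
# Schema statements to run once before the bulk import (neo4j 3.x syntax;
# 4.x uses CREATE CONSTRAINT ... FOR ... REQUIRE). Labels and the
# objectid property follow BloodHound's usual schema, assumed here.
schema_statements = [
    "CREATE CONSTRAINT ON (u:User) ASSERT u.objectid IS UNIQUE",
    "CREATE CONSTRAINT ON (c:Computer) ASSERT c.objectid IS UNIQUE",
    "CREATE CONSTRAINT ON (g:Group) ASSERT g.objectid IS UNIQUE",
    "CREATE CONSTRAINT ON (d:Domain) ASSERT d.objectid IS UNIQUE",
]
# for stmt in schema_statements:   # session from an open neo4j driver
#     session.run(stmt)
```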

Two "'NoneType' object is not iterable" errors

When parsing the output of BloodHound I get two distinct `'NoneType' object is not iterable` errors:

Traceback (most recent call last):
  File "C:\Python27\Scripts\bloodhound-import-script.py", line 11, in <module>
    load_entry_point('bloodhound-import==0.0.2', 'console_scripts', 'bloodhound-import')()
  File "c:\python27\lib\site-packages\bloodhound_import\__init__.py", line 40, in main
    parse_file(filename, driver)
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 402, in parse_file
    session.write_transaction(parse_function, entry)
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 708, in write_transaction
    return self._run_transaction(WRITE_ACCESS, unit_of_work, *args, **kwargs)
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 674, in _run_transaction
    result = unit_of_work(tx, *args, **kwargs)
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 220, in parse_computer
    create_computer_query(tx, computer, query, 'DcomUsers', 'ExecuteDCOM')
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 89, in create_computer_query
    for entry in computer[value]:
TypeError: 'NoneType' object is not iterable

Traceback (most recent call last):
  File "C:\Python27\Scripts\bloodhound-import-script.py", line 11, in <module>
    load_entry_point('bloodhound-import==0.0.2', 'console_scripts', 'bloodhound-import')()
  File "c:\python27\lib\site-packages\bloodhound_import\__init__.py", line 40, in main
    parse_file(filename, driver)
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 402, in parse_file
    session.write_transaction(parse_function, entry)
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 708, in write_transaction
    return self._run_transaction(WRITE_ACCESS, unit_of_work, *args, **kwargs)
  File "c:\python27\lib\site-packages\neo4j\__init__.py", line 674, in _run_transaction
    result = unit_of_work(tx, *args, **kwargs)
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 342, in parse_domain
    process_ace_list(tx, domain['Aces'], name, "Domain")
  File "c:\python27\lib\site-packages\bloodhound_import\importer.py", line 40, in process_ace_list
    for entry in ace_list:
TypeError: 'NoneType' object is not iterable

Import fails when importing *_computers.json

bloodhound-import -du "neo4j" -dp "password" ~/Upload/1/20190206114521_computers.json

please see the error message below:

[INFO] 2019-02-06 13:12:25,419 - Parsing 1 files
[INFO] 2019-02-06 13:12:25,420 - Parsing bloodhound file: /home/users/Upload/1/20190206114521_computers.json
Traceback (most recent call last):
  File "/home/user/.local/bin/bloodhound-import", line 11, in <module>
    sys.exit(main())
  File "/home/user/.local/lib/python3.5/site-packages/bloodhound_import/__init__.py", line 40, in main
    parse_file(filename, driver)
  File "/home/user/.local/lib/python3.5/site-packages/bloodhound_import/importer.py", line 402, in parse_file
    session.write_transaction(parse_function, entry)
  File "/home/user/.local/lib/python3.5/site-packages/neo4j/__init__.py", line 708, in write_transaction
    return self._run_transaction(WRITE_ACCESS, unit_of_work, *args, **kwargs)
  File "/home/user/.local/lib/python3.5/site-packages/neo4j/__init__.py", line 674, in _run_transaction
    result = unit_of_work(tx, *args, **kwargs)
  File "/home/user/.local/lib/python3.5/site-packages/bloodhound_import/importer.py", line 220, in parse_computer
    create_computer_query(tx, computer, query, 'DcomUsers', 'ExecuteDCOM')
  File "/home/user/.local/lib/python3.5/site-packages/bloodhound_import/importer.py", line 89, in create_computer_query
    for entry in computer[value]:
TypeError: 'NoneType' object is not iterable
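Both this crash and the two in the previous issue stem from the same pattern: SharpHound emits JSON `null` rather than an empty list for empty collections, so `for entry in computer[value]:` iterates over `None`. A one-line guard, sketched here with an illustrative entry (the function name is mine, not the project's):

```python
def iter_entries(collection):
    """Normalise SharpHound collections before iterating: the collector
    emits null instead of [] when a collection is empty."""
    return collection or []

# Illustrative entry: DcomUsers was serialised as null.
computer = {"DcomUsers": None, "LocalAdminUsers": [{"Name": "BOB@DOMAIN"}]}
for entry in iter_entries(computer["DcomUsers"]):  # safely iterates nothing
    pass
```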

Import performance - is this normal?

Despite the fact that I am running the import from the same server as Neo4j, the import is much slower than the UI. Has anybody else observed times like these? It takes nearly three hours to import 9,000 users.

[INFO] 2021-03-08 11:50:29,205 - Parsing 1 files
[INFO] 2021-03-08 11:50:29,205 - Parsing bloodhound file: E:\Bloodhound\JSON\20210308103715_users.json
[INFO] 2021-03-08 12:02:38,419 - Parsed 973 out of 9736 records in E:\Bloodhound\JSON\20210308103715_users.json.
. . . .
[INFO] 2021-03-08 13:19:03,274 - Parsed 5838 out of 9736 records in E:\Bloodhound\JSON\20210308103715_users.json.
. . . 
[INFO] 2021-03-08 14:37:55,007 - Parsed 9730 out of 9736 records in E:\Bloodhound\JSON\20210308103715_users.json.
[INFO] 2021-03-08 14:38:04,210 - Completed file: E:\Bloodhound\JSON\20210308103715_users.json
[INFO] 2021-03-08 14:38:04,210 - Done
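Times like these usually point to one query round-trip per record. A common remedy is to batch records and insert each batch with a single `UNWIND` statement; the sketch below is an assumption about how the importer could be restructured, not its current behaviour, and the query's label/property names are illustrative:

```python
def batches(records, size=1000):
    """Group records into fixed-size chunks so each transaction runs one
    UNWIND query over the whole chunk instead of one query per record."""
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # trailing partial chunk

# Illustrative query: one statement upserts an entire batch of users.
unwind_query = (
    "UNWIND $props AS prop "
    "MERGE (u:User {objectid: prop.objectid}) SET u += prop"
)
# for chunk in batches(user_entries):      # user_entries: parsed JSON records
#     session.run(unwind_query, props=chunk)
```

With uniqueness constraints in place to back the `MERGE` lookups, batched `UNWIND` imports routinely cut hours down to minutes.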

Doesn't parse *_ous.json files

Probably needs a parse_ous function.

Current situation:

$ bloodhound-import -du neo4j -dp neo4j filename_ous.json
[WARNING] 2019-01-22 17:28:53,360 - Parsing 2 files
[WARNING] 2019-01-22 17:28:53,361 - Parsing bloodhound file: filename_ous.json
Traceback (most recent call last):
  File "/usr/local/bin/bloodhound-import", line 9, in <module>
    load_entry_point('bloodhound-import==0.0.1', 'console_scripts', 'bloodhound-import')()
  File "build/bdist.linux-x86_64/egg/bloodhound_import/__init__.py", line 40, in main
  File "build/bdist.linux-x86_64/egg/bloodhound_import/importer.py", line 321, in parse_file
KeyError: u'ous'
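A hypothetical sketch of what a `parse_ous` step could hand to neo4j. The property names (`Guid`, `Properties`) mirror other SharpHound object types but are assumptions, not the real OU schema, and the query is illustrative:

```python
def build_ou_query(entry):
    """Build a (query, parameters) pair for one OU entry. Field names are
    assumptions modelled on other SharpHound objects, not the real schema."""
    props = entry.get("Properties") or {}  # tolerate null, as elsewhere
    query = "MERGE (o:OU {objectid: $id}) SET o += $props"
    return query, {"id": entry.get("Guid"), "props": props}
```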

Does the import clear/reset the existing DB?

I run the SharpHound scans on a scheduled basis and will use this tool to update the neo4j DB after each scan.

Will the tool clear out existing objects from the DB, or should I attach a new/empty copy of the DB each time?
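Whatever the tool's own behaviour, the database can always be reset manually before each import with a single (destructive!) Cypher statement; the driver/session setup is omitted here:

```python
# Deletes every node and relationship in the database: a clean slate
# before re-importing a fresh SharpHound scan.
clear_query = "MATCH (n) DETACH DELETE n"
# with driver.session() as session:   # driver setup omitted
#     session.run(clear_query)
```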
