blacklanternsecurity / bbot
A recursive internet scanner for hackers.
Home Page: https://www.blacklanternsecurity.com/bbot/
License: GNU General Public License v3.0
Tried installing BBOT with pipx and got this:
root@box:~# pipx install bbot
ERROR: Could not find a version that satisfies the requirement bbot (from versions: none)
ERROR: No matching distribution found for bbot
Anything I can do on my end?
A fallback should be provided so that if BBOT does not have an internet connection, tldextract does not prevent the scan from running.
Output modules seem to ignore the watched_events value, as well as the filter_event method. To specify a particular set of events, it is currently necessary to explicitly check event.type against self.watched_events, like this:
def handle_event(self, event):
    # Workaround: manually filter, since watched_events is ignored for output modules
    if event.type in self.watched_events:
        do_stuff()
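For comparison, here is a minimal sketch of how an output module would ideally declare its interest once this is fixed (the BaseOutputModule import path and the logging helper are assumptions based on BBOT's module conventions):
from bbot.modules.output.base import BaseOutputModule  # import path assumed


class MyOutput(BaseOutputModule):
    # Declaring watched_events alone should restrict incoming events
    watched_events = ["DNS_NAME", "URL"]

    def filter_event(self, event):
        # This hook should be consulted before handle_event() is called
        return "target" in event.tags

    def handle_event(self, event):
        # With the bug fixed, no manual event.type check is needed here
        self.info(f"Got {event.type}: {event.data}")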
This bug causes a new event's scope distance to be calculated incorrectly if the event's source is the event itself.
Sometimes the viewdns module produces garbage data. It would be helpful if we could find a way to filter this. If filtering out the garbage data is not possible, it should be removed from the subdomain-enum group.
Documentation should be added explaining how to install dependencies for all modules without running an actual scan.
In version 2.0.6, -host must be explicitly used to specify the target
When using the Python API, the new config is not properly mirrored into the operating system's environment variables. A sketch of the expected mirroring is shown below.
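A rough sketch of the kind of mirroring that is expected; the BBOT_ prefix and flattening scheme below are assumptions for illustration, not BBOT's actual implementation:
import os


def mirror_config_to_env(config, prefix="BBOT"):
    # Flatten a nested config dict into environment variables, e.g.
    # {"modules": {"httpx": {"timeout": 5}}} -> BBOT_MODULES_HTTPX_TIMEOUT=5
    for key, value in config.items():
        name = f"{prefix}_{key}".upper()
        if isinstance(value, dict):
            mirror_config_to_env(value, prefix=name)
        else:
            os.environ[name] = str(value)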
Certain ProjectDiscovery tools like httpx use hardcoded DNS servers, including 8.8.8.8. This behavior is not ideal; system nameservers should be used instead so that BBOT can be leveraged for internal scans, etc. However, there is currently a bug in httpx that prevents DNS from working on certain nameservers. We have opened an issue for this. Once that bug is fixed, we need to ensure that we are passing through the system's nameservers via -r to nuclei, naabu, httpx, etc., as sketched below.
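A minimal sketch of what passing through the system resolvers could look like; the helper below simply reads /etc/resolv.conf, and -r is assumed to accept a comma-separated resolver list as documented for these tools:
import subprocess


def system_nameservers(path="/etc/resolv.conf"):
    # Collect nameserver entries from the system resolver config
    servers = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 2 and parts[0] == "nameserver":
                servers.append(parts[1])
    return servers


resolvers = ",".join(system_nameservers())
# The same -r value would be passed to nuclei and naabu as well
subprocess.run(["httpx", "-r", resolvers, "-u", "https://example.com"])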
Add functionality to the Cloud Helper to tag appropriate returned events as "Cloud" resources when discovered (e.g., AWS/Azure/Google Cloud/Cloudflare IPs, DNS names, websites, endpoints), as sketched below.
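A rough sketch of the tagging logic using a simple CIDR lookup; the provider ranges and the event.add_tag() call below are illustrative assumptions, not BBOT's actual cloud helper:
import ipaddress

# Hypothetical provider ranges for illustration only; real ranges would come
# from each provider's published IP range feeds
CLOUD_RANGES = {
    "aws": ["3.0.0.0/8"],
    "cloudflare": ["104.16.0.0/13"],
}


def cloud_tags(ip):
    # Return tags like "cloud-aws" for an IP that falls inside a known range
    tags = []
    addr = ipaddress.ip_address(ip)
    for provider, cidrs in CLOUD_RANGES.items():
        if any(addr in ipaddress.ip_network(cidr) for cidr in cidrs):
            tags.append(f"cloud-{provider}")
    return tags


# for tag in cloud_tags(event.host):  # event.add_tag() is assumed to exist
#     event.add_tag(tag)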
When handling large amounts of duplicate events, the scan appears to hang. Reproducible with this command:
bbot -c web_spider_distance=2 web_spider_depth=2 -t www.hackredcon.com -m httpx
It is unknown why these UNVERIFIED_URL events take so long to process, since they should not be undergoing any type of DNS resolution, etc.
The tldextract library causes SSL verify errors even with ssl_verify=false in the config when in an SSL-proxied (MITM) environment.
As shown in the stack trace, when operating in an environment where all traffic is being intercepted, this library makes calls via requests to https://publicsuffix.org/list/public_suffix_list.dat and https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat. These calls do not respect the ssl_verify=false setting that the rest of BBOT respects (such as the web request helper).
At the end of the stack trace, --current-config was run to show that ssl_verify is currently false.
The issue appears to be rooted in the tldextract library. A possible workaround is sketched after the log below.
root@c0965812342f:/opt/bbot# poetry run bbot -d -m httpx -t https://icanhazip.com
[INFO] bbot.cli: Command: bbot -d -m httpx -t https://icanhazip.com
[DBUG] bbot.core.event: Autodetected event type "URL" based on data: "https://icanhazip.com"
[INFO] bbot.scanner: Loading 1 modules: httpx
[INFO] bbot.scanner: Loaded module "httpx"
[SUCC] bbot.scanner: Loaded 1 modules
[INFO] bbot.scanner: Starting scan asdf
[INFO] bbot.scanner: Setting up modules
[DBUG] bbot.modules.httpx: Setting up module httpx
[DBUG] bbot.modules.httpx: Finished setting up module httpx
[INFO] bbot.scanner: Finished setting up modules
[INFO] bbot.scanner: Target: Event("URL", "https://icanhazip.com")
{"type": "URL", "data": "https://icanhazip.com", "module": "module", "source": "f762108ca727f0bcd961fc7467c248887dfa70dc:TARGET", "id": "baed85c983e6564d59e4ca4cbf62df4d78d3492d:URL", "tags": ["target"]}
[INFO] bbot.scanner: Starting modules
[INFO] bbot.scanner: 1 modules started
Exception reading Public Suffix List url https://publicsuffix.org/list/public_suffix_list.dat
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 190, in run_and_cache
result = self.get(namespace=namespace, key=key_args)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 93, in get
raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 190, in run_and_cache
result = self.get(namespace=namespace, key=key_args)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 93, in get
raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 386, in _make_request
self._validate_conn(conn)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 1040, in _validate_conn
conn.connect()
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connection.py", line 414, in connect
self.sock = ssl_wrap_socket(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/local/lib/python3.9/ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "/usr/local/lib/python3.9/ssl.py", line 1040, in _create
self.do_handshake()
File "/usr/local/lib/python3.9/ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/adapters.py", line 440, in send
resp = conn.urlopen(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 785, in urlopen
retries = retries.increment(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='publicsuffix.org', port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/suffix_list.py", line 30, in find_first_response
return cache.cached_fetch_url(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 199, in cached_fetch_url
return self.run_and_cache(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 192, in run_and_cache
result = func(**kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 209, in _fetch_url
response = session.get(url, timeout=timeout)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/sessions.py", line 542, in get
return self.request('GET', url, **kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/sessions.py", line 529, in request
resp = self.send(prep, **send_kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/sessions.py", line 645, in send
r = adapter.send(request, **kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/adapters.py", line 517, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='publicsuffix.org', port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
Exception reading Public Suffix List url https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 190, in run_and_cache
result = self.get(namespace=namespace, key=key_args)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 93, in get
raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 190, in run_and_cache
result = self.get(namespace=namespace, key=key_args)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 93, in get
raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'}"
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 386, in _make_request
self._validate_conn(conn)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 1040, in _validate_conn
conn.connect()
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connection.py", line 414, in connect
self.sock = ssl_wrap_socket(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/local/lib/python3.9/ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "/usr/local/lib/python3.9/ssl.py", line 1040, in _create
self.do_handshake()
File "/usr/local/lib/python3.9/ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/adapters.py", line 440, in send
resp = conn.urlopen(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 785, in urlopen
retries = retries.increment(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/suffix_list.py", line 30, in find_first_response
return cache.cached_fetch_url(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 199, in cached_fetch_url
return self.run_and_cache(
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 192, in run_and_cache
result = func(**kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/tldextract/cache.py", line 209, in _fetch_url
response = session.get(url, timeout=timeout)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/sessions.py", line 542, in get
return self.request('GET', url, **kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/sessions.py", line 529, in request
resp = self.send(prep, **send_kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/sessions.py", line 645, in send
r = adapter.send(request, **kwargs)
File "/root/.cache/pypoetry/virtualenvs/bbot-IFSyk-JB-py3.9/lib/python3.9/site-packages/requests/adapters.py", line 517, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
[VERB] bbot.scanner: Events queued: 0 (None)
[VERB] bbot.scanner: Tasks queued: 0 (None)
[DBUG] bbot.modules.httpx: Handling batch of 1 events for module "httpx"
[SUCC] bbot.scanner: Scan asdf completed with status FINISHED
root@c0965812342f:/opt/bbot# poetry run bbot --current-config
modules:
httpx:
allow_skip_portscan: true
nuclei: {}
sublist3r: {}
sslcert:
timeout: 4.0
dnsresolve:
max_hosts: 65536
naabu: {}
aspnet_viewstate: {}
dnsx:
wordlist: https://raw.githubusercontent.com/danielmiessler/SecLists/master/Discovery/DNS/subdomains-top1million-20000.txt
wayback: {}
dnsdumpster: {}
max_threads: 250
http_proxy: null
http_timeout: 30
ssl_verify: false
user_agent: Mozilla/5.0 (iPhone; CPU iPhone OS 13_2_3 like Mac OS X) AppleWebKit/605.1.15
(KHTML, like Gecko) Version/13.0.3 Mobile/15E148 Safari/604.1
dns_wildcard_tests: 5
url_extension_blacklist:
- png
- jpg
- jpeg
- gif
- svg
- css
- woff
- woff2
- ttf
root@c0965812342f:/opt/bbot#
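One possible mitigation, sketched under the assumption that falling back to tldextract's bundled suffix list snapshot is acceptable: constructing the extractor with an empty suffix_list_urls tuple prevents any outbound HTTPS fetch, so the intercepting proxy's certificate never comes into play. This is only one option, not necessarily the fix BBOT should adopt.
import tldextract

# With no suffix list URLs configured, tldextract uses its bundled snapshot
# and never makes an outbound HTTPS request
no_fetch_extract = tldextract.TLDExtract(suffix_list_urls=())

parts = no_fetch_extract("www.evilcorp.co.uk")
print(parts.subdomain, parts.domain, parts.suffix)  # www evilcorp co.uk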
After disabling TCP DNS because it is unsupported on Ubuntu, DNS requests often time out even on valid hosts and valid DNS servers, especially when massdns is running. There should be a relatively low timeout value set for DNS requests (e.g. 5 seconds) and a reasonable number of retries per query (e.g. 3), as sketched below.
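A minimal sketch of that timeout/retry policy using dnspython, with the 5-second timeout and 3 retries suggested above (this is not BBOT's actual resolver code):
import dns.exception
import dns.resolver


def resolve_with_retries(name, rdtype="A", retries=3, timeout=5):
    # Short per-attempt timeout plus a few retries per query
    resolver = dns.resolver.Resolver()
    resolver.timeout = timeout   # seconds to wait on each nameserver
    resolver.lifetime = timeout  # total seconds allowed per attempt
    for attempt in range(retries):
        try:
            return resolver.resolve(name, rdtype)
        except (dns.exception.Timeout, dns.resolver.NoNameservers):
            if attempt == retries - 1:
                raise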
Removing the temp directory at the end of a scan causes problems if another BBOT scan is running at the same time; one possible fix is sketched below.
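One possible fix, sketched as an assumption about how scans could be isolated rather than a description of BBOT's current code: give each scan its own uniquely named temp directory and remove only that directory during teardown.
import shutil
import tempfile
from pathlib import Path

BBOT_TEMP_ROOT = Path.home() / ".bbot" / "temp"


def make_scan_tempdir(scan_id):
    # A directory unique to this scan, so concurrent scans never collide
    BBOT_TEMP_ROOT.mkdir(parents=True, exist_ok=True)
    return Path(tempfile.mkdtemp(prefix=f"{scan_id}-", dir=BBOT_TEMP_ROOT))


def cleanup_scan_tempdir(scan_dir):
    # Remove only this scan's directory, leaving other scans' files alone
    shutil.rmtree(scan_dir, ignore_errors=True)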
When you apt install chromium-browser on Ubuntu, it silently installs via a snap instead 🤬. Gowitness is unable to use the snap-installed chromium because of its isolated home directories, and because snap requires a daemon, it cannot be installed in a Docker container. We need to find a way to install chromium on Ubuntu without using snap.
When modules use self.warning() or self.error() to log messages, this helper should check whether there is an active exception and, if so, automatically log.debug() the traceback. This will allow us to delete a lot of unnecessary module code that uses traceback.format_exc(). A sketch of such a helper follows.
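A minimal sketch of the helper described above; the implementation details are assumptions, but the idea is simply to check sys.exc_info() before debug-logging the traceback:
import logging
import sys
import traceback

log = logging.getLogger("bbot.modules.example")


def warning(message):
    # Log the warning and, if we are inside an except block, also
    # debug-log the active traceback so modules don't have to
    log.warning(message)
    if sys.exc_info()[0] is not None:
        log.debug(traceback.format_exc())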
Hey, I'm looking to contribute to this project.
Can you please help me get started on bugs or features I could work on, and point me to a Discord or Slack where I could interact with other contributors and ask questions?
It would be nice to add 1 or 2 retries to httpx by default.
The following features would improve usability for BBOT's Python API: scan.start() should yield events so that the JSON output module is not needed; see the sketch below.
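A sketch of the desired generator-style API; the Scanner constructor arguments follow BBOT's documented Python usage, while the yielding behavior of scan.start() is the proposed change, not current behavior:
from bbot.scanner import Scanner

scan = Scanner("evilcorp.com", modules=["httpx"])

# Proposed: iterate over events directly instead of parsing JSON output
for event in scan.start():
    print(event.type, event.data)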
When BBOT is set to proxy through Burp Suite, it seems that httpx discovers HTTPS URLs but not HTTP ones.
The README should include documentation on how scope distance works and how targets can also be files.
Create a module to identify vulnerable subdomains that could be victims of a subdomain takeover:
https://medium.com/@nynan/what-i-learnt-from-reading-217-subdomain-takeover-bug-reports-c0b94eda4366
Example projects:
Under certain conditions wildcard domains will produce DNS_NAMEs like this: _wildcard._wildcard._wildcard.evilcorp.com
The extractor-based nuclei templates fail to capture the extracted data. In most cases, this should be folded into the description.
As discussed in #40, trying a blank sudo password has the potential to lock out accounts or generate failed-login alerts when run in a hardened environment. The testing pipeline should set the BBOT_SUDO_PASS environment variable to an empty string to avoid coding this behavior into the tool itself.
If this behavior is confusing (for example, BBOT prompts the user for a sudo password when their account does not require one to escalate), supplement the prompt with "or press enter for none".
If the tool needs to be able to test for an empty password, consider running a command like the following to check for sudo rights without generating a failed logon:
SUDO_ASKPASS=/bin/false /bin/sudo -An /bin/true &> /dev/null
Nuclei budget-mode calculations can take 30+ seconds on some systems. We should implement some kind of caching system to avoid this when possible; a sketch is shown below.
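A rough sketch of a disk cache keyed on the installed template version; the cache location and keying scheme are assumptions for illustration, not what BBOT actually does:
import json
from pathlib import Path

CACHE_FILE = Path.home() / ".bbot" / "cache" / "nuclei_budget.json"


def cached_budget(templates_version, compute_fn):
    # Reuse previously computed budget results for the same template version
    if CACHE_FILE.exists():
        cached = json.loads(CACHE_FILE.read_text())
        if cached.get("version") == templates_version:
            return cached["budget"]
    budget = compute_fn()  # the expensive 30+ second calculation
    CACHE_FILE.parent.mkdir(parents=True, exist_ok=True)
    CACHE_FILE.write_text(json.dumps({"version": templates_version, "budget": budget}))
    return budget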
To avoid compiling it ourselves, BBOT pulls the precompiled version of blacklist3r. However, the key file has been updated between releases. We need to write an Ansible task specifically for grabbing the most up-to-date MachineKeys.txt file.
When a dependency is installed, its hash is stored so that we can later verify that it was properly installed. However, dependencies are sometimes reinstalled despite the cached value.
Massdns occasionally generates false positives for wildcard domains
It would be nice for wordlists to be specified as dependencies (instead of in module.setup()) so that they can be installed with --install-all-deps.
This warning would indicate to the user that DNS is probably blocked outbound, and that they either need to fix that or choose to disable the massdns module.
The auto-versioning feature is no longer working. We should consider using Poetry to publish.
When running:
bbot -f subdomain-enum -m httpx -t evilcorp.com -n evilcorp -o . -c modules.massdns.max_resolver=5000
the following error is produced:
[ERRR] bbot.scanner.manager: Error in ScanManager._emit_event(): 6841513254754510330
Update:
2022-10-06 04:56:17,807 [DEBUG] bbot.modules.dnscommonsrv base.py:543 Traceback (most recent call last):
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/modules/base.py", line 425, in _event_postcheck
if not self.filter_event(event):
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/modules/dnscommonsrv.py", line 100, in filter_event
is_wildcard, _ = self.helpers.is_wildcard(event.host)
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/core/helpers/dns.py", line 530, in is_wildcard
query_ips = self.resolve(query, type=("A", "AAAA"), retries=retries, cache_result=True)
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/core/helpers/dns.py", line 86, in resolve
raw_results, errors = self.resolve_raw(query, **kwargs)
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/core/helpers/dns.py", line 122, in resolve_raw
r, e = self._resolve_hostname(query, rdtype=t, **kwargs)
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/core/helpers/dns.py", line 150, in _resolve_hostname
results = self._dns_cache[dns_cache_hash]
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/core/helpers/cache.py", line 128, in __getitem__
return self.get(item)
File "/root/.local/pipx/venvs/bbot/lib/python3.9/site-packages/bbot/core/helpers/cache.py", line 76, in get
return self._cache[name_hash]
KeyError: 8714145451267874800
The aspnet_viewstate module occasionally throws an error when httpx does not include a response-body, as shown below.
[ERRR] bbot.scanner.manager: Error in aspnet_viewstate.handle_event(): 'response-body'
[DBUG] bbot.scanner.manager: Traceback (most recent call last):
File "/home/xxxxxxxxxx/.local/lib/python3.9/site-packages/bbot/scanner/manager.py", line 266, in catch
ret = callback(*args, **kwargs)
File "/home/xxxxxxxxxxx/.local/lib/python3.9/site-packages/bbot/modules/aspnet_viewstate.py", line 48, in handle_event
generator_match = self.generator_regex.search(event.data["response-body"])
KeyError: 'response-body'
This is not leading to false negatives, but error handling needs to be improved; a defensive fix is sketched below.
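A minimal sketch of the defensive fix, based on the traceback above (only the relevant part of handle_event is shown; the surrounding module code is assumed):
def handle_event(self, event):
    # Guard against httpx events that carry no response body
    body = event.data.get("response-body", "")
    if not body:
        self.debug(f"No response-body in event {event}, skipping")
        return
    generator_match = self.generator_regex.search(body)
    # ... remainder of the module's existing logic ...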
Running the SSL cert module constantly produces these errors:
[DBUG] bbot.modules.sslcert: Error with SSL handshake on xxx.xxx.xxx.xxx port 80: 'float' object is not callable
This does not appear to create any false positives or false negatives, but it may be introducing a performance impact.
This error has been encountered when cancelling a scan with the vhost module enabled.
[ERRR] bbot.core.helpers.command: Error in _feed_pipe(): [Errno 24] Too many open files: '/root/.bbot/temp/v8gh2nm1k1bok0hnmwdk
We are seeing "potential sqli parameter" events for non-in-scope URLs (github etc.)
In preparation for writing cloud enumeration modules such as storage bucket enumeration, etc., it may be helpful to have a "cloud helper" that stores useful information for each of the major cloud providers, e.g. AWS, Azure, Google, Digital Ocean, etc. These helpers would inherit from a base class and store information like what the secrets look like and which domains are used for storage buckets, so that excavate could dynamically extract cloud-provider-specific goodies. The helpers would also be used by modules to store any information that may be useful across multiple modules. A sketch of the base-class idea follows.
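A rough sketch of the base-class idea described above; the class layout, attribute names, and example patterns are assumptions for illustration, not BBOT's actual cloud helper:
import re


class BaseCloudProvider:
    # Provider-specific knowledge shared across modules
    name = "base"
    bucket_domains = []   # domains used for storage buckets (for excavate)
    secret_patterns = []  # regexes describing what this provider's secrets look like

    def extract_buckets(self, text):
        # Yield anything that looks like a bucket hostname for this provider
        for domain in self.bucket_domains:
            for match in re.finditer(rf"[\w.-]+\.{re.escape(domain)}", text):
                yield match.group()


class AWSProvider(BaseCloudProvider):
    name = "aws"
    bucket_domains = ["s3.amazonaws.com"]
    secret_patterns = [re.compile(r"AKIA[0-9A-Z]{16}")]  # AWS access key ID format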
The ASN module should fall back to another service instead of displaying a warning every time it fails.