
ffuf - Fuzz Faster U Fool

A fast web fuzzer written in Go.

Installation

  • Download a prebuilt binary from the releases page, unpack, and run!

    or

  • If you are on macOS with Homebrew, ffuf can be installed with: brew install ffuf

    or

  • If you have a recent Go compiler installed: go install github.com/ffuf/ffuf/v2@latest (the same command works for updating)

    or

  • git clone https://github.com/ffuf/ffuf ; cd ffuf ; go get ; go build

Ffuf depends on Go 1.16 or greater.

Example usage

The usage examples below show just the simplest tasks you can accomplish using ffuf.

More elaborate documentation that goes through many features with a lot of examples is available in the ffuf wiki at https://github.com/ffuf/ffuf/wiki

For more extensive documentation, with real-life usage examples and tips, be sure to check out the awesome guide "Everything you need to know about FFUF" by Michael Skelton (@codingo).

You can also practise your ffuf scans against a live host with different lessons and use cases, either locally using the Docker container at https://github.com/adamtlangley/ffufme, or against the live hosted version at http://ffuf.me, created by Adam Langley (@adamtlangley).

Typical directory discovery


By using the FUZZ keyword at the end of the URL (-u):

ffuf -w /path/to/wordlist -u https://target/FUZZ

Virtual host discovery (without DNS records)


Assuming that the default virtual host response size is 4242 bytes, we can filter out all responses of that size (-fs 4242) while fuzzing the Host header:

ffuf -w /path/to/vhost/wordlist -u https://target -H "Host: FUZZ" -fs 4242

GET parameter fuzzing

GET parameter name fuzzing is very similar to directory discovery and works by defining the FUZZ keyword as part of the URL. This example also assumes a response size of 4242 bytes for an invalid GET parameter name.

ffuf -w /path/to/paramnames.txt -u https://target/script.php?FUZZ=test_value -fs 4242

If the parameter name is known, the values can be fuzzed the same way. This example assumes that an incorrect parameter value returns HTTP status code 401.

ffuf -w /path/to/values.txt -u https://target/script.php?valid_name=FUZZ -fc 401

POST data fuzzing

This is a very straightforward operation, again using the FUZZ keyword. This example fuzzes only part of the POST request. We are again filtering out the 401 responses.

ffuf -w /path/to/postdata.txt -X POST -d "username=admin&password=FUZZ" -u https://target/login.php -fc 401

Maximum execution time

If you don't want ffuf to run indefinitely, you can use the -maxtime flag. This stops the entire process after the given time (in seconds).

ffuf -w /path/to/wordlist -u https://target/FUZZ -maxtime 60

When working with recursion, you can control the maximum time per job using -maxtime-job. This stops the current job after the given time (in seconds) and continues with the next one. New jobs are created when the recursion functionality detects a subdirectory.

ffuf -w /path/to/wordlist -u https://target/FUZZ -maxtime-job 60 -recursion -recursion-depth 2

It is also possible to combine both flags, limiting the per-job maximum execution time as well as the overall execution time. If you do not use recursion, the two flags behave identically.
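Combining the two could look like this (the wordlist path and target are placeholders):

```shell
# Cap each recursion job at 60 seconds and the entire run at 10 minutes
ffuf -w /path/to/wordlist -u https://target/FUZZ -recursion -recursion-depth 2 \
     -maxtime-job 60 -maxtime 600
```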

Using external mutator to produce test cases

For this example, we'll fuzz JSON data that's sent over POST. Radamsa is used as the mutator.

When --input-cmd is used, ffuf will display matches as their position. This same position value will be available for the callee as an environment variable $FFUF_NUM. We'll use this position value as the seed for the mutator. Files example1.txt and example2.txt contain valid JSON payloads. We are matching all the responses, but filtering out response code 400 - Bad request:

ffuf --input-cmd 'radamsa --seed $FFUF_NUM example1.txt example2.txt' -H "Content-Type: application/json" -X POST -u https://ffuf.io.fi/FUZZ -mc all -fc 400

It of course isn't very efficient to call the mutator for each payload, so we can also pre-generate the payloads, still using Radamsa as an example:

# Generate 1000 example payloads
radamsa -n 1000 -o %n.txt example1.txt example2.txt

# This results in files 1.txt ... 1000.txt
# Now we can just read the payload data in a loop from file for ffuf

ffuf --input-cmd 'cat $FFUF_NUM.txt' -H "Content-Type: application/json" -X POST -u https://ffuf.io.fi/ -mc all -fc 400
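To see what ffuf feeds to the runner, you can replay a single --input-cmd iteration by hand: ffuf exports the payload position as $FFUF_NUM and uses the command's stdout as the payload. A minimal sketch with throwaway demo files (the filenames and payload contents are hypothetical):

```shell
# Create two hypothetical pre-generated payload files in a temp dir
dir=$(mktemp -d)
printf '{"user":"a"}' > "$dir/1.txt"
printf '{"user":"b"}' > "$dir/2.txt"

# Equivalent of what ffuf runs internally for payload position 2:
# the input command sees FFUF_NUM in its environment and prints the payload
FFUF_NUM=2 sh -c "cat $dir/\$FFUF_NUM.txt"   # prints {"user":"b"}
```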

Configuration files

When running ffuf, it first checks whether a default configuration file exists. The default path for the ffufrc file is $XDG_CONFIG_HOME/ffuf/ffufrc. You can set one or multiple options in this file, and they will be applied to every subsequent ffuf job. An example ffufrc file can be found in the ffuf repository.
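As a rough sketch, a minimal ffufrc could look like the following. The section and option names follow the repository's ffufrc.example; the values here are illustrative assumptions, not recommended defaults:

```toml
# $XDG_CONFIG_HOME/ffuf/ffufrc -- applied to every ffuf run unless overridden
[http]
    headers = [
        "X-Example: demo"    # hypothetical default header
    ]
    timeout = 10

[general]
    colors = true

[output]
    outputformat = "json"
```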

A more detailed description about configuration file locations can be found in the wiki: https://github.com/ffuf/ffuf/wiki/Configuration

Configuration options provided on the command line override the ones loaded from the default ffufrc file. Note that this does not apply to CLI flags that can be provided more than once, such as the -H (header) flag. In that case, the -H values provided on the command line are appended to the ones from the config file.

Additionally, if you wish to keep a set of configuration files for different use cases, you can define the configuration file path with the -config command line flag, which takes the path to the configuration file as its parameter.
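For example (the profile path is hypothetical):

```shell
# Select a per-engagement profile instead of the default ffufrc
ffuf -config ~/ffuf-profiles/recon.ffufrc -w wordlist.txt -u https://target/FUZZ
```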

Usage

To define the test case for ffuf, use the keyword FUZZ anywhere in the URL (-u), headers (-H), or POST data (-d).

Fuzz Faster U Fool - v2.1.0

HTTP OPTIONS:
  -H                  Header `"Name: Value"`, separated by colon. Multiple -H flags are accepted.
  -X                  HTTP method to use
  -b                  Cookie data `"NAME1=VALUE1; NAME2=VALUE2"` for copy as curl functionality.
  -cc                 Client cert for authentication. Client key needs to be defined as well for this to work
  -ck                 Client key for authentication. Client certificate needs to be defined as well for this to work
  -d                  POST data
  -http2              Use HTTP2 protocol (default: false)
  -ignore-body        Do not fetch the response content. (default: false)
  -r                  Follow redirects (default: false)
  -raw                Do not encode URI (default: false)
  -recursion          Scan recursively. Only FUZZ keyword is supported, and URL (-u) has to end in it. (default: false)
  -recursion-depth    Maximum recursion depth. (default: 0)
  -recursion-strategy Recursion strategy: "default" for a redirect based, and "greedy" to recurse on all matches (default: default)
  -replay-proxy       Replay matched requests using this proxy.
  -sni                Target TLS SNI, does not support FUZZ keyword
  -timeout            HTTP request timeout in seconds. (default: 10)
  -u                  Target URL
  -x                  Proxy URL (SOCKS5 or HTTP). For example: http://127.0.0.1:8080 or socks5://127.0.0.1:8080

GENERAL OPTIONS:
  -V                  Show version information. (default: false)
  -ac                 Automatically calibrate filtering options (default: false)
  -acc                Custom auto-calibration string. Can be used multiple times. Implies -ac
  -ach                Per host autocalibration (default: false)
  -ack                Autocalibration keyword (default: FUZZ)
  -acs                Custom auto-calibration strategies. Can be used multiple times. Implies -ac
  -c                  Colorize output. (default: false)
  -config             Load configuration from a file
  -json               JSON output, printing newline-delimited JSON records (default: false)
  -maxtime            Maximum running time in seconds for entire process. (default: 0)
  -maxtime-job        Maximum running time in seconds per job. (default: 0)
  -noninteractive     Disable the interactive console functionality (default: false)
  -p                  Seconds of `delay` between requests, or a range of random delay. For example "0.1" or "0.1-2.0"
  -rate               Rate of requests per second (default: 0)
  -s                  Do not print additional information (silent mode) (default: false)
  -sa                 Stop on all error cases. Implies -sf and -se. (default: false)
  -scraperfile        Custom scraper file path
  -scrapers           Active scraper groups (default: all)
  -se                 Stop on spurious errors (default: false)
  -search             Search for a FFUFHASH payload from ffuf history
  -sf                 Stop when > 95% of responses return 403 Forbidden (default: false)
  -t                  Number of concurrent threads. (default: 40)
  -v                  Verbose output, printing full URL and redirect location (if any) with the results. (default: false)

MATCHER OPTIONS:
  -mc                 Match HTTP status codes, or "all" for everything. (default: 200-299,301,302,307,401,403,405,500)
  -ml                 Match amount of lines in response
  -mmode              Matcher set operator. Either of: and, or (default: or)
  -mr                 Match regexp
  -ms                 Match HTTP response size
  -mt                 Match how many milliseconds to the first response byte, either greater or less than. EG: >100 or <100
  -mw                 Match amount of words in response

FILTER OPTIONS:
  -fc                 Filter HTTP status codes from response. Comma separated list of codes and ranges
  -fl                 Filter by amount of lines in response. Comma separated list of line counts and ranges
  -fmode              Filter set operator. Either of: and, or (default: or)
  -fr                 Filter regexp
  -fs                 Filter HTTP response size. Comma separated list of sizes and ranges
  -ft                 Filter by number of milliseconds to the first response byte, either greater or less than. EG: >100 or <100
  -fw                 Filter by amount of words in response. Comma separated list of word counts and ranges

INPUT OPTIONS:
  -D                  DirSearch wordlist compatibility mode. Used in conjunction with -e flag. (default: false)
  -e                  Comma separated list of extensions. Extends FUZZ keyword.
  -enc                Encoders for keywords, eg. 'FUZZ:urlencode b64encode'
  -ic                 Ignore wordlist comments (default: false)
  -input-cmd          Command producing the input. --input-num is required when using this input method. Overrides -w.
  -input-num          Number of inputs to test. Used in conjunction with --input-cmd. (default: 100)
  -input-shell        Shell to be used for running command
  -mode               Multi-wordlist operation mode. Available modes: clusterbomb, pitchfork, sniper (default: clusterbomb)
  -request            File containing the raw http request
  -request-proto      Protocol to use along with raw request (default: https)
  -w                  Wordlist file path and (optional) keyword separated by colon. eg. '/path/to/wordlist:KEYWORD'

OUTPUT OPTIONS:
  -debug-log          Write all of the internal logging to the specified file.
  -o                  Write output to file
  -od                 Directory path to store matched results to.
  -of                 Output file format. Available formats: json, ejson, html, md, csv, ecsv (or, 'all' for all formats) (default: json)
  -or                 Don't create the output file if we don't have results (default: false)

EXAMPLE USAGE:
  Fuzz file paths from wordlist.txt, match all responses but filter out those with content-size 42.
  Colored, verbose output.
    ffuf -w wordlist.txt -u https://example.org/FUZZ -mc all -fs 42 -c -v

  Fuzz Host-header, match HTTP 200 responses.
    ffuf -w hosts.txt -u https://example.org/ -H "Host: FUZZ" -mc 200

  Fuzz POST JSON data. Match all responses not containing text "error".
    ffuf -w entries.txt -u https://example.org/ -X POST -H "Content-Type: application/json" \
      -d '{"name": "FUZZ", "anotherkey": "anothervalue"}' -fr "error"

  Fuzz multiple locations. Match only responses reflecting the value of "VAL" keyword. Colored.
    ffuf -w params.txt:PARAM -w values.txt:VAL -u https://example.org/?PARAM=VAL -mr "VAL" -c

  More information and examples: https://github.com/ffuf/ffuf

Interactive mode

By pressing ENTER during ffuf execution, the process is paused and the user is dropped into a shell-like interactive mode:

entering interactive mode
type "help" for a list of commands, or ENTER to resume.
> help

available commands:
 afc  [value]             - append to status code filter 
 fc   [value]             - (re)configure status code filter 
 afl  [value]             - append to line count filter 
 fl   [value]             - (re)configure line count filter 
 afw  [value]             - append to word count filter 
 fw   [value]             - (re)configure word count filter 
 afs  [value]             - append to size filter 
 fs   [value]             - (re)configure size filter 
 aft  [value]             - append to time filter 
 ft   [value]             - (re)configure time filter 
 rate [value]             - adjust rate of requests per second (active: 0)
 queueshow                - show job queue
 queuedel [number]        - delete a job in the queue
 queueskip                - advance to the next queued job
 restart                  - restart and resume the current ffuf job
 resume                   - resume current ffuf job (or: ENTER) 
 show                     - show results for the current job
 savejson [filename]      - save current matches to a file
 help                     - you are looking at it
> 

In this mode, filters can be reconfigured, the queue managed, and the current state saved to disk.

When (re)configuring the filters, they are applied retroactively: all matches already in memory that the newly added filters would have excluded are deleted.

The new state of matches can be printed with the show command, which lists all matches as if they had been found with the new filters in place.

As "negative" matches are not stored in memory, relaxing the filters unfortunately cannot bring back lost matches. For that scenario, use the restart command, which resets the state and starts the current job from the beginning.

License

ffuf is released under the MIT license. See LICENSE.


ffuf's Issues

[Feature Request] Offset for response lengths

curl -s "https://www.facebook.com/" | wc -c
142767
curl -s "https://www.facebook.com/" | wc -c
142720
  • Sometimes response lengths can be dynamic. Is there any way to set an offset value along with the -fs filter, so that response lengths within an offset of the filtered size are also filtered out?

  • This is problematic for both parameter fuzzing and vhost fuzzing.

[Feature Request] Print Redirect Location

It would be very useful if you added an optional parameter to show the redirect location for 30x responses.

Example

domain.com    [Status: 301, Size: 0, Words: 1] -> /wordpress/blog

Check for 403 response for every request

Great tool! The fastest I've tried for directory bruting.

One request I'd like to make: check whether every response returns a 403. I've seen other tools quit with a warning if almost every response is a 403, instead of going through the entire wordlist.

Retry Connection Reset

First off, I love this tool. It is awesome. My only feature request: instead of printing "Error in runner: Get <url> ...read: connection reset by peer" and "Client.Timeout exceeded" errors, retry the request and increment an 'error' counter in the status line. This would keep tests from being dropped due to an overloaded server, and keep the output somewhat cleaner.

-input-cmd Expects FUZZ Keyword

Hi there,

Thank you for working on ffuf, it's an exciting tool! I am not sure if I am missing something, but it appears that ffuf needs the FUZZ keyword to be defined in the POST data when giving it data with -input-cmd, which I guess makes sense:


If I also supply -d "FUZZ" in the arguments and intercept the requests, all of them are blank. It seems that -d doesn't play well with -input-cmd.

Additionally, what advantage does -input-cmd offer over just creating multiple fuzz payloads with Radamsa, appending them to a file, and using that with -w?

PS: I tried from both Linux and Windows boxes, but only intercepted on Windows using PowerShell. Manually testing my -input-cmd command reads the file's contents fine.

Error in runner while vhostscan

Hi,

While running ffuf to fuzz virtual hosts for one of my URLs, it gives this error:

Error in runner: Get https://www.domain.com/: EOF

Command running with:

./ffuf -k -c -t 10 -mc 200 -w ./wordlist.txt -H 'Host: FUZZ.domain.com' -u https://www.domain.com/ 

Things I have tried:

  1. Using smaller wordlist
  2. Lowering thread counts
  3. Trying other fuzzing like parameter for same host

It only fails for virtual-host fuzzing, and only for this URL.

Thanks

accurate filtering

Hi,
As I understand it, the current behavior when specifying multiple filter options is to filter a response if any one of them matches. Is there any way to filter only if all the specified filters match?

For example, if I specify both -fc 200 and -fs 0, the results will exclude responses with status code 200 (regardless of content length) as well as responses with content length 0 (regardless of status code).
But what if I need to filter only responses where the status code is 200 AND the content length is 0? I cannot find such an option.

Suggestion to add url-list argument

First of all, you are doing amazing work on this tool. Second, it would be a huge plus if you added an argument for brute-forcing a list of URLs (domains), alongside -u for a single domain.

[request] stop probes on receiving a lot of 429 status code

Hi joohoi,

ffuf doesn't stop when probes receive a 429 status code.
In my opinion, the -sa flag should also look for 429 status codes and, if they are repeated some number of times, stop the probes.

This is how I am using ffuf:

./ffuf -sa -mc all -fc 400,405,406,418,444,500,501,502,503,504,508,520 -ac -c -l -k -r -w untitled.txt -u https://signup.example.com/FUZZ


POST Parameter Fuzzing possible false positive

POST /reflected/parameter/body HTTP/1.1
Host: public-firing-range.appspot.com
Connection: close
Cache-Control: max-age=0
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Sec-Fetch-Site: none
Referer: https://public-firing-range.appspot.com/reflected/index.html
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9
Content-Type: application/x-www-form-urlencoded
Content-Length: 20

q=xxxxxxxxxxxxxxxxxx
  • Response
<html>
  <body>
    xxxxxxxxxxxxxxxxxx
  </body>
</html>
ffuf -w w -u https://public-firing-range.appspot.com/reflected/parameter/body -X POST -d 'FUZZ=xxxxx'

        /'___\  /'___\           /'___\       
       /\ \__/ /\ \__/  __  __  /\ \__/       
       \ \ ,__\\ \ ,__\/\ \/\ \ \ \ ,__\      
        \ \ \_/ \ \ \_/\ \ \_\ \ \ \ \_/      
         \ \_\   \ \_\  \ \____/  \ \_\       
          \/_/    \/_/   \/___/    \/_/       

       v0.11git
________________________________________________

 :: Method       : POST
 :: URL          : https://public-firing-range.appspot.com/reflected/parameter/body
 :: Matcher      : Response status: 200,204,301,302,307,401,403
________________________________________________

sdgs                    [Status: 200, Size: 38, Words: 9]
q                       [Status: 200, Size: 38, Words: 9]
dsagds                  [Status: 200, Size: 38, Words: 9]
dsgdsg                  [Status: 200, Size: 38, Words: 9]
:: Progress: [4/4] :: 0 req/sec :: Duration: [0:00:00] :: Errors: 0 ::


# cat w
dsagds
dsgdsg
q
sdgs

Suggest [ Follow The directories and Bruteforce On them In the same Command ]

I suggest adding the ability to follow discovered directories and brute-force them after the main scan completes.

Like This :

ffuf -w list.txt -u https://Site.com/FUZZ -fc 403 
......
admin                [Status: 301, Size: 413, Words: 46]
uploads              [Status: 301, Size: 414, Words: 46]

.....

When this completes, ffuf would automatically brute-force /admin/ and /uploads/.

AutoCalibrate seems to ignore redirects

Hello! When using the awesome -ac option, it seems to ignore 301s and 302s. They don't get added to the filters. As an example, I made a sample wordlist with:

test1
admin
stuff
othertest

Running this against http://www.irccloud.com (which has a public bounty program) returns:

ffuf -w /tmp/wordlist.txt -u http://www.irccloud.com/FUZZ -ac -x http://127.0.0.1:8080

othertest               [Status: 301, Size: 0, Words: 1]
test1                   [Status: 301, Size: 0, Words: 1]
stuff                   [Status: 301, Size: 0, Words: 1]
admin                   [Status: 301, Size: 0, Words: 1]
:: Progress: [4/4] :: 0 req/sec :: Duration: [0:00:00] :: Errors: 0 ::

In the proxy, I also see the three calibration requests, and they all return 301 with the same size and word counts as well. I'm going to poke around in the source a bit and see if I can figure out a fix, but Go is definitely not my strong point!

Unsolicited response received on idle HTTP channel starting with "HTTP/1.0 408 Request Time-out\n

Faced this error many, many times during scans:
:: Progress: [1499/1000000] :: 53 req/sec :: Duration: [0:00:28] :: Errors: 94 ::2019/06/14 23:06:37 Unsolicited response received on idle HTTP channel starting with "HTTP/1.0 408 Request Time-out\nCache-Control: no-cache\nConnection: close\nContent-Type: text/html\n\n<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01//EN\" \"http://www.w3.org/TR/html4/strict.dtd\">\n<html>\n<head>\n\t<title>Down for Maintenance 408</title>\n\t<meta name=\"viewport\" content=\"initial-scale=1.0, width=device-width, maximum-scale=1.0, user-scalable=no\" />\n\t<script src=\"//ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js\"></script>\n\t<link rel=\"stylesheet\" type=\"text/css\" href=\"//ds.phncdn.com/www-static/css/maintenance.css?cache=2017072823\" />\n\n <script type=\"text/javascript\">\n (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){\n (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),\n m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)\n })(window,document,'script','//www.google-analytics.com/analytics.js','ga');\n ga('create', 'UA-2623535-1', 'pornhub.com');\n ga('require', 'displayfeatures');\n ga('send', 'pageview');\n </script>\n</head>\n<body>\n\t<div id=\"PornhubNetworkBar\"></div>\n\t<div id=\"maintenancePageResponsive\">\n\t\t<div class=\"titleMaintenance clearfix\">\n\t\t\t<div class=\"topContainer clearfix\">\n\t\t\t\t<div class=\"phImage\"><img src=\"//ds.phncdn.com/www-static/images/pornhub_logo.png?cache=2017072823\" /></div>\n\t\t\t\t<div class=\"textMaintenance\"><h1>Is undergoing maintenance</h1></div>\n\t\t\t</div>\n\t\t</di"; err=<nil>

[Feature Request] Support multi-proxy.

Hi!

Thanks for developing such a great tool. Would be great to support multiple proxies, as wfuzz does for example.

Not sure how complex it would be, but supporting HTTP and SOCKS5 proxies and cycling each request through a different proxy would be great.

No output file when the command does not finish the entire wordlist

Ffuf does not write to the designated output file unless the command finishes.

Considering wordlists can be rather large and cancelling them before completion is a regular occurrence, the tool should write to the output file as it runs, in case something happens and the command is unable to finish.

[Feature request] Recursive fuzzing

It would be awesome to have a recursive option, e.g. --recursive, with an option to specify the depth.
Every time we get a 200 response on a folder, ffuf would automatically scan its subfolders after the initial scan.

Suggestion to output full urls, not only words found

It would be very useful to have the ability to output not only the words found, like
api about static
but full URLs, like
https://github.com/ffuf/ https://github.com/api/ https://github.com/about/
Saved CSV reports contain no record of the scan query, so it would be great to see full URLs in each of the hundreds of saved report files, to tell which target they came from.

[request] filter results by blacklist instead of whitelist

Current logic

If resp.Status matches one of the statuses defined, the entry is valid (match-only)

Desired logic

If resp.Status does not match any of the blacklisted statuses, the entry is valid (match-all)

Sane defaults: 404,410,502,503

getting "too many open files" a few times

Hi @joohoi,

Thank you for working on this. I've just noticed this error a few times now, with random targets.

ffuf -t 50 -fs 0 -k -mc 200 -w word.txt -u https://test.site.com/FUZZ

Error in runner: Get https://test.site.com//config/locales/ja.yml: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//config/locales/simple_form.en.yml: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//config/newrelic.yml: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php.save: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configs/: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.jsp: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php.bak: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php.dist: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php.old: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//connect.php: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php.txt: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration.php~: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configuration/: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//configure/: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//conflg.php: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//console/login/LoginForm.jsp: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//console/: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files
Error in runner: Get https://test.site.com//content/: dial tcp: lookup test.site.com on 10.1.94.9:53: dial udp 10.1.94.9:53: socket: too many open files

max time feature

It would be nice to have a feature similar to curl's --max-time option, to bail out if ffuf has been running for more than N seconds.

[Feature Request] Support sessions.

Hi!

As a first stage, a simple field in the .json output recording the last line sent would be great for supporting resumable sessions.

In an ideal world, storing the captured data and being able to reprocess it as wfpayload does in wfuzz would be awesome.

I'm just writing ideas, don't expect anything to be implemented, I really appreciate what you guys are doing.

Thanks!

[Feature Request] URL Encoding Option

Hello,

It would be nice if ffuf had an option to URL-encode the FUZZ input, as many lists contain spaces or other special characters.

It could accept a list of characters to auto-URL encode for instance:

-url-encode " #&"

Hello, Honorable Author

Hello ffuf author, I just want to display and export requests that have a specified string in the response. What should I write? Using -mr does not seem to work; requests that do not contain the "test" string in the response are also shown.

[Improvement] Collect data for each request to avoid data loss.

The output seems to be written at the end of the discovery, leading to a loss of data if the program is stopped manually.

Sometimes could be interesting to grep partial data and use it to readjust a new fuzzing session.

For now, an approach that avoids losing findings is to split the work into two steps, using ffuf -s | tee output and then running < output | ffuf -w -.

Thanks :)!

[request] Allow custom autocalibrate, and more than once.

Currently, calibration is done using randomstr/, but it fails when FUZZ is in a non-directory context, e.g.:

echo file.php | ffuf -u https://www/example.com/FUZZ -c -ac -w -
Error in autocalibration, exiting: Get https://www/example.com/adminXVlBzgbaiCMRAjWw/: dial tcp: lookup www: Temporary failure in name resolution

It would be an improvement if -ac allowed a list of calibration payloads; a command could be:

-ac 'adminXVlBzgbaiCMRAjWw/','adminXVlBzgbaiCMRAjWw.jWw','adminXVlBzgbaiCMRAjWw.php'

Or with multiple parameters:

-ac 'adminXVlBzgbaiCMRAjWw/' -ac 'adminXVlBzgbaiCMRAjWw.jWw' -ac 'adminXVlBzgbaiCMRAjWw.php'
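Supporting several calibration probes could look roughly like the sketch below: request each probe, collect the distinct baseline response sizes, and turn each into a filter. calibrationFilters and the fetch callback are hypothetical names standing in for the real request machinery.

```go
package main

import "fmt"

// calibrationFilters issues every calibration probe and returns the
// distinct response sizes seen, each a candidate for an -fs style filter.
func calibrationFilters(probes []string, fetch func(string) int) []int {
	seen := map[int]bool{}
	var sizes []int
	for _, p := range probes {
		size := fetch(p)
		if !seen[size] {
			seen[size] = true
			sizes = append(sizes, size)
		}
	}
	return sizes
}

func main() {
	// Pretend response size is proportional to probe length.
	fake := func(p string) int { return len(p) * 100 }
	probes := []string{
		"adminXVlBzgbaiCMRAjWw/",
		"adminXVlBzgbaiCMRAjWw.php",
	}
	fmt.Println(calibrationFilters(probes, fake))
}
```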

Certificate validation error

Hey, when I run ffuf against some hosts I get the error shown below.
Unfortunately I can't share the exact host.
When I run it without the -ac flag I just get tons of errors (one for each request).

./ffuf -c -k -w common.txt -mc all -mr ".* " -t 80 -u 'https://mydomain.omega.bf1.yahoo.com/FUZZ' -ac -k
Error in autocalibration, exiting: Get https://mydomain.omega.bf1.yahoo.com/adminXVlBzgbaiCMRAjWw/: x509: certificate is valid for *.media.yahoo.com, *.data.yahoo.com, *.fantasysports.yahoo.com, *.finance.yahoo.com, *.flurry.com, *.fp.yahoo.com, *.geo.yahoo.com, *.io.yahoo.net, maw.ouroath.com, *.maw.ouroath.com, *.media.yql.yahoo.com, *.mobile.yahoo.com, *.paas.ec.yahoo.com, *.protrade.com, *.publishing.oath.com, *.query.yahoo.com, *.query.yahooapis.com, *.search.yahoo.com, *.sports.yahoo.com, *.video.yahoo.com, *.yahoo.com, *.yql.yahoo.com, *.yql.yahooapis.com, *.canary1-bf1.media.yahoo.com, *.canary1-gq1.media.yahoo.com, *.canary1-ir2.media.yahoo.com, *.canary1-ne1.media.yahoo.com, *.canary1-sg3.media.yahoo.com, *.canary1-tp2.media.yahoo.com, *.canary1-tw1.media.yahoo.com, *.prod1-bf1.media.yahoo.com, *.prod1-gq1.media.yahoo.com, *.prod1-ir2.media.yahoo.com, *.prod1-ne1.media.yahoo.com, *.prod1-sg3.media.yahoo.com, *.prod1-tp2.media.yahoo.com, *.prod1-tw1.media.yahoo.com, *.alpha1-bf1.media.yahoo.com, *.beta1-bf1.media.yahoo.com, *.stage1-bf1.media.yahoo.com, actservices.oath.com, adspecs.oath.com, *.adspecs.oath.com, *.adspecs.yahoo.com, *.admanagerplus.yahoo-inc.com, admanagerplus.yahoo-inc.com, *.admanagerplus.yahoo.com, *.datatables.org, *.entertainment.yahoo.com, query.yahooapis.com, subs.communications.yahoo.com, *.virtualstudio.yahoo.com, *.yax.yahoo.com, *.yql.yimg.com, api.lps.cp.yahoo.com, *.vads.yahoo.com, *.developer.yahoo.com, ad.com, *.ad.com, *.ad.yahoo.com, adspecs.verizonmedia.com, *.adspecs.verizonmedia.com, *.verizonmedia.com, *.canary1-bf1.omega.yahoo.com, *.canary1-gq1.omega.yahoo.com, *.canary1-ir2.omega.yahoo.com, *.canary1-ne1.omega.yahoo.com, *.canary1-sg3.omega.yahoo.com, *.canary1-tp2.omega.yahoo.com, *.canary1-tw1.omega.yahoo.com, *.prod1-bf1.omega.yahoo.com, *.prod1-gq1.omega.yahoo.com, *.prod1-ir2.omega.yahoo.com, *.prod1-ne1.omega.yahoo.com, *.prod1-sg3.omega.yahoo.com, *.prod1-tp2.omega.yahoo.com, *.prod1-tw1.omega.yahoo.com, 
*.alpha1-bf1.omega.yahoo.com, *.beta1-bf1.omega.yahoo.com, *.stage1-bf1.omega.yahoo.com, *.test1-gq1.omega.yahoo.com, *.digits.vzbuilders.com, *.vzbuilders.com, *.v2-canary1-gq1.omega.yahoo.com, *.v2-canary1-bf1.omega.yahoo.com, *.v2-canary1-ne1.omega.yahoo.com, *.v2-canary1-ir2.omega.yahoo.com, *.v2-canary1-sg3.omega.yahoo.com, *.v2-canary1-tp2.omega.yahoo.com, *.v2-canary1-tw1.omega.yahoo.com, *.v2-prod1-gq1.omega.yahoo.com, *.v2-prod1-bf1.omega.yahoo.com, *.v2-prod1-ne1.omega.yahoo.com, *.v2-prod1-ir2.omega.yahoo.com, *.v2-prod1-sg3.omega.yahoo.com, *.v2-prod1-tp2.omega.yahoo.com, *.v2-prod1-tw1.omega.yahoo.com, *.tp.vzbuilders.com, *.mmi.vzbuilders.com, *.tripod.yahoo.com, not mydomain.omega.bf1.yahoo.com

Support %ext%

Hi, I see you only replace %EXT%; some wordlists use %ext% too.
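Handling both spellings is a small change; a minimal sketch of the substitution (expandExt is an illustrative name, not ffuf's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// expandExt substitutes the extension for both the upper- and
// lower-case keyword, so wordlists using either spelling work.
func expandExt(word, ext string) string {
	word = strings.ReplaceAll(word, "%EXT%", ext)
	return strings.ReplaceAll(word, "%ext%", ext)
}

func main() {
	fmt.Println(expandExt("backup.%ext%", "php"))
	fmt.Println(expandExt("index.%EXT%", "html"))
}
```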

[Request] Extensions without %EXT%

Thanks for your work; ffuf is now my favorite tool for web discovery/fuzzing.

Would it be possible to allow -e without -D and without %EXT% in the wordlist?

I have the wordlist Web-Content/common.txt, which doesn't contain extensions or %EXT%.
For example, ffuf -w common.txt -e .php,.html -u https://target/FUZZ would test login.php and login.html.
And maybe implement -E, which would try with and without the extension, so login.php, login.html AND login.
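The expansion being asked for could be sketched as below: each wordlist entry fans out into one variant per extension, with a flag to keep the bare word too. withExtensions and the proposed -E semantics are the requester's suggestion, not an existing ffuf feature.

```go
package main

import "fmt"

// withExtensions expands one wordlist entry into variants with each
// extension appended; includeBare also keeps the original word
// (the behaviour proposed for -E).
func withExtensions(word string, exts []string, includeBare bool) []string {
	var out []string
	if includeBare {
		out = append(out, word)
	}
	for _, e := range exts {
		out = append(out, word+e)
	}
	return out
}

func main() {
	// Equivalent of: -w common.txt -e .php,.html with the bare word kept.
	for _, w := range withExtensions("login", []string{".php", ".html"}, true) {
		fmt.Println(w)
	}
}
```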

Saving the output in md

Can you alter the order of the column fields when saving in md format?

FUZZ | URL | Redirectlocation | Position | Status Code | Content Length | Content Words | Content Lines |

This is the default order.

Status Code | Content Length | Content Words | Content Lines | FUZZ | URL | Redirectlocation | Position |

If you change it to this, the output will be prettier: the first four columns will line up and the structure will be easier to scan.
Let me know what you think.
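Since a Markdown row is just fields joined with pipes, reordering columns reduces to changing the order of the slice passed to the row writer. A minimal sketch (mdRow is an illustrative helper, not ffuf's actual Markdown writer):

```go
package main

import (
	"fmt"
	"strings"
)

// mdRow joins fields into one Markdown table row; column order is
// simply the order of the slice.
func mdRow(fields []string) string {
	return "| " + strings.Join(fields, " | ") + " |"
}

func main() {
	// The proposed column order: fixed-width numeric columns first.
	proposed := []string{"Status Code", "Content Length", "Content Words",
		"Content Lines", "FUZZ", "URL", "Redirectlocation", "Position"}
	fmt.Println(mdRow(proposed))
}
```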

v0.10 seems to be broken in Windows

v0.9 (the 386 build) worked fine on Windows.
v0.10 does not start:

Program 'ffuf.exe' failed to run: Access is denied. At line:1 char:1

Also, it seems that since v0.10 Windows considers ffuf to be dangerous.
