Mining URLs from dark corners of Web Archives for bug hunting/fuzzing/further probing


ParamSpider: Parameter miner for humans


Key Features:

  • Mines parameters from web archives of the entered domain, without interacting with the target host.

  • Finds parameters from subdomains as well.

  • Supports excluding URLs with specific extensions.

  • Saves the results in a clean, organized format.
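The archive-mining idea can be sketched in a few lines of Python. This is a simplified illustration, not ParamSpider's actual implementation: it assumes URLs have already been pulled from the Wayback Machine's CDX API and shows only the core step of replacing each parameter value with a placeholder.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# The archive query itself (network call omitted here) would hit the
# Wayback Machine CDX endpoint, e.g.:
#   https://web.archive.org/cdx/search/cdx?url=example.com/*&fl=original&collapse=urlkey

def fuzzify(url, placeholder="FUZZ"):
    """Replace every query-parameter value with the placeholder."""
    parts = urlsplit(url)
    params = parse_qsl(parts.query, keep_blank_values=True)
    if not params:
        return None  # no parameters -> nothing to fuzz
    query = urlencode([(name, placeholder) for name, _ in params])
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

print(fuzzify("https://hackerone.com/test.php?p=test&q=1"))
# -> https://hackerone.com/test.php?p=FUZZ&q=FUZZ
```

Because only archived data is read, the target host never sees a request during mining.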

Usage instructions:

Requires: Python 3.8+

$ git clone https://github.com/0xKayala/ParamSpider
$ cd ParamSpider
$ pip install -r requirements.txt
$ python paramspider.py --domain hackerone.com

Usage options:

1 - Simple scan [without the --exclude option]
$ python paramspider.py --domain hackerone.com
-> Example output: https://hackerone.com/test.php?q=FUZZ

2 - Exclude URLs with specific extensions
$ python paramspider.py --domain hackerone.com --exclude php,jpg,svg

3 - Find nested parameters
$ python paramspider.py --domain hackerone.com --level high
-> Example output: https://hackerone.com/test.php?p=test&q=FUZZ

4 - Save the results to a file
$ python paramspider.py --domain hackerone.com --exclude php,jpg --output hackerone.txt

5 - Use a custom placeholder (default is FUZZ)
$ python paramspider.py --domain hackerone.com --placeholder CUSTOM

6 - Quiet mode (suppress console output)
$ python paramspider.py --domain hackerone.com --quiet

7 - Exclude subdomains (subdomains are included by default)
$ python paramspider.py --domain hackerone.com --subs False

8 - Retry failed requests (e.g., on 4xx/5xx errors)
$ python paramspider.py --domain hackerone.com --retries 5
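To make the `--exclude` option concrete, here is a hedged Python sketch of the idea of filtering URLs by extension. The function name is made up for illustration; the real option is handled inside ParamSpider itself.

```python
from urllib.parse import urlsplit

def exclude_extensions(urls, exts):
    """Drop URLs whose path ends with any of the given extensions."""
    blocked = tuple("." + ext.lower().lstrip(".") for ext in exts)
    return [u for u in urls if not urlsplit(u).path.lower().endswith(blocked)]

urls = [
    "https://hackerone.com/test.php?q=FUZZ",
    "https://hackerone.com/logo.svg?v=FUZZ",
    "https://hackerone.com/search?q=FUZZ",
]
print(exclude_extensions(urls, ["php", "svg"]))
# -> ['https://hackerone.com/search?q=FUZZ']
```

Note that only the path is checked, so a query string containing ".php" would not cause a false exclusion.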

ParamSpider + GF (for massive pwnage)

Let's say you have already installed ParamSpider and now want to filter the juicy parameters out of the plethora of results. No worries, you can easily do this with GF (by tomnomnom).

Note: Make sure Go is properly installed on your machine.

Follow these steps:

$ go get -u github.com/tomnomnom/gf
$ cp -r $GOPATH/src/github.com/tomnomnom/gf/examples ~/.gf

Note: Replace '/User/levi/go/bin/gf' with the path to the gf binary on your system.

$ alias gf='/User/levi/go/bin/gf'
$ cd ~/.gf/

Note: Copy the JSON profiles (https://github.com/0xKayala/ParamSpider/tree/master/gf_profiles) into the ~/.gf/ folder.

Now run ParamSpider and navigate to the output directory.

$ gf redirect domain.txt    # potential open redirect/SSRF parameters
$ gf xss domain.txt         # potential XSS-vulnerable parameters
$ gf potential domain.txt   # XSS + SSRF + open redirect parameters
$ gf wordpress domain.txt   # WordPress URLs
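Under the hood, a gf profile is essentially a stored regex applied to each line of the file. A rough Python equivalent of the redirect filter might look like the sketch below; the parameter-name list is purely illustrative, not gf's actual profile.

```python
import re

# Illustrative redirect-prone parameter names (hypothetical; the real
# gf redirect profile uses its own pattern list).
REDIRECT_PARAMS = ("url", "redirect", "next", "dest", "return", "goto")
PATTERN = re.compile(r"[?&](?:%s)=" % "|".join(REDIRECT_PARAMS), re.IGNORECASE)

def grep_redirect(lines):
    """Keep only URLs containing a redirect-style parameter name."""
    return [line for line in lines if PATTERN.search(line)]

urls = [
    "https://hackerone.com/out?url=FUZZ",
    "https://hackerone.com/test.php?q=FUZZ",
]
print(grep_redirect(urls))
# -> ['https://hackerone.com/out?url=FUZZ']
```

The same pattern-matching idea, with different regexes, covers the xss, potential, and wordpress profiles.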

[More GF profiles to be added in the future]

Example:

$ python paramspider.py --domain bugcrowd.com --exclude woff,css,js,png,svg,php,jpg --output bugcrowd.txt

Note:

Since the parameters are fetched from web archive data, the chance of false positives is high.

Contributing to ParamSpider:

  • Report bugs and missing best practices
  • DM me with new ideas
  • Make more GF profiles (.json files)
  • Help fix bugs
  • Submit pull requests

My Twitter:

Say hello: 0xAsm0d3us
