Bug Bounty Methodology – How to Approach a Target

Many people are new to bug bounty, and most of them are stuck on what to do first. I would say there is no single first thing and no best method. Everyone has a different mentality, so everyone's approach is different. The things which work for me may not work for you. I will tell you my way of approaching a target.

Recon

You already know that information gathering is the most important aspect of hacking, and the same applies to bug bounty. For me, I keep doing recon until I understand the application or find something interesting. That could be information disclosure on Google or GitHub, or in JavaScript, but I only touch JavaScript once I actually start using the application.

Subdomain

Subdomains are one of the most important things to look for. I have already written some blogs about subdomain recon, but what I personally do is use Sublist3r, amass, assetfinder, crt.sh and subfinder. I have a bash script which runs all of these tools and saves all the unique subdomains in subdomain.txt. I find it quite good because it takes around 5 to 10 minutes depending on the target and it's automated.
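My script is not public, but a minimal sketch of that kind of wrapper could look like this (the file names and exact tool flags are assumptions, adjust them to your setup):

#!/bin/bash
# Rough sketch of a subdomain-enumeration wrapper (not my exact script)
domain=$1
subfinder -d "$domain" -silent >> subs-raw.txt
sublist3r -d "$domain" -o sublist3r.txt && cat sublist3r.txt >> subs-raw.txt
assetfinder --subs-only "$domain" >> subs-raw.txt
amass enum -passive -d "$domain" >> subs-raw.txt
# certificate transparency via crt.sh
curl -s "https://crt.sh/?q=%25.$domain&output=json" | jq -r '.[].name_value' | sed 's/\*\.//' >> subs-raw.txt
sort -u subs-raw.txt > subdomain.txt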

Google Dork and GitHub

Google dorking is a simple way to find information disclosure. The file extensions I look for are bak, old, sql, xml, conf, ini, txt etc. You can simply use site:example.com ext:txt. For GitHub recon, I suggest you watch the GitHub recon video from Bugcrowd.
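A few example dorks along those lines (example.com is just a placeholder):

site:example.com ext:bak OR ext:old OR ext:sql
site:example.com ext:conf OR ext:ini OR ext:xml
site:example.com inurl:admin ext:txt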

Wayback Machine

The Wayback Machine is useful for finding URLs and pages which you can't see in the current application but which may still be working, and most importantly, parameters. I use both an automated and a manual approach for the Wayback Machine. For the automated part, I use the waybackurls tool from tomnomnom. I simply run cat subdomain.txt | waybackurls | tee waybackurls.txt. This gives me Wayback URLs for all the subdomains I have and saves them in a file. After that, I manually check how the application has changed over time.

After that, I look for common URLs with parameters or containing certain words, for example cat waybackurls.txt | grep -i 'url=' for url=. I also search for things like admin, /?, redirect= etc.
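The kind of filters I run over that file look roughly like this (the patterns are only examples of what I grep for):

grep -i 'url=' waybackurls.txt
grep -iE 'redirect=|return=|next=' waybackurls.txt
grep -i '/admin' waybackurls.txt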

Subdomain Takeover

The next thing I look for is whether there is any chance of a subdomain takeover. I personally haven't found a subdomain takeover yet, but I still look for it. I do it both manually and with automation tools. First, I use subjack for the automation: subjack -w subdomain.txt -v -a -ssl. It will scan for subdomain takeover.

Another thing I do is scan all the subdomains for their IP addresses and DNS records and save the results in a CSV file, and then I check them manually, which is quite time-consuming. For this scan I have one more bash script. For more information about subdomain takeover, check Can I Take Over XYZ.
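Again, the script itself is simple; a rough sketch of resolving each subdomain and writing the result as CSV could look like this (the file names are assumptions):

while read sub; do
  ip=$(dig +short A "$sub" | head -n1)     # first A record, empty if none
  cname=$(dig +short CNAME "$sub")         # CNAME record, useful for takeover checks
  echo "$sub,$ip,$cname" >> dns-records.csv
done < subdomain.txt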

Screenshot

Aquatone is the tool I use for screenshots, so I can go through all the domains: cat subdomain.txt | httprobe --prefer-https | aquatone. After the scan is over, I go through the report and look for anything interesting.

Port Scanning

Port scanning with service detection is also important; sometimes one domain has multiple web services on multiple ports. You might be testing https://example.com on port 443, but example.com:8081 may also have a web service running, or some other vulnerable service. As I said above, I have a script that saves the IP addresses in a CSV file.

So I have one more script that takes all the IP addresses and scans them for open ports. Mostly I scan 80, 443, 8080, 21 and 22, then look for the services and versions on those ports. I once found a vulnerability on FTP and got RCE. Some people also go after SSH, e.g. password attacks. You can use masscan: masscan -p21,22,80,443,8080.
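A fuller version of that kind of scan, assuming the IP addresses are saved in a file called ips.txt (the file names and the nmap follow-up are my own sketch, not the exact script):

masscan -p21,22,80,443,8080 -iL ips.txt --rate 1000 -oL masscan.txt   # fast sweep of the interesting ports
nmap -sV -p 21,22,80,443,8080 -iL ips.txt -oN nmap-services.txt       # service and version detection on the same ports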

Directory Fuzzing

There are various tools for directory fuzzing; you just need a good wordlist. You can use SecLists. For tools, I use both dirsearch and DirBuster, and I run both through the Burp Suite proxy. What I mean is, for example, with dirsearch: python3 dirsearch.py -u example.com -w wordlist.txt -e PHP,json,xml -f -r --random-agents --plain-text-report=dirsearch.txt --http-proxy=localhost:8080. With the help of the proxy, every request goes through Burp Suite, so Burp builds a site map for me. There is one tool named ffuf which is quite famous; I haven't used it so I can't comment, but I will check it out in a few days.

Extensions which I look for are PHP, XML, JSON, ASP, ASPX, HTML and TXT.

I also have my own wordlists, which I created in multiple ways. For example, from robots.txt: I have built one wordlist from all the targets I have tested, so I basically have a robots.txt wordlist containing paths from every application I have tested, plus wordlists built in other ways. Let me show you how you can create a wordlist from robots.txt.

User-agent: *
Disallow: /admin
Disallow: /users
Allow: /image/xyz.jpg

This is a typical robots.txt. What I do is pull the paths out with some Linux commands: curl -s https://example.com/robots.txt | grep -i 'disallow' | cut -d ':' -f 2 | sort -u | tee -a robots.txt. curl sends a request to the domain; from the output we keep only the Disallow lines, then cut removes the 'Disallow' label and keeps just the path, sort -u deduplicates, and tee appends the result to the wordlist file.
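To build the combined wordlist across all targets, a loop of this kind works (subdomain.txt and robots-wordlist.txt are my assumed file names):

while read host; do
  curl -s "https://$host/robots.txt" | grep -i 'disallow' | cut -d ':' -f 2 | tr -d ' '
done < subdomain.txt | sort -u | tee -a robots-wordlist.txt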

Parameter Finding with Arjun

Arjun is a great tool which helps you find parameters for GET and POST requests. Many times I have found hidden parameters this way: python3 arjun.py -u example.com/users --get. This is a very basic example which I use a lot.

Javascript endpoint finder

JavaScript files contain endpoints, and sometimes these endpoints lead to admin panels or other sensitive locations. You can use JSParser.
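If you just want a quick look without a dedicated tool, a rough grep over downloaded JavaScript files can also surface paths (this one-liner is my own sketch, not a replacement for JSParser; js-files/ is an assumed directory of saved scripts):

grep -rhoE '"/[a-zA-Z0-9_/.-]+"' js-files/ | sort -u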

Note

I do directory fuzzing and parameter finding many times, not just once. Even after testing the actual application, if I find any path which I feel could be interesting, I scan that path again for directories and parameters.

There are many other things in my recon, like virtual host discovery, Netcraft, Shodan etc. It all depends on the application.

Testing the actual application

By now I have a good site map in my Burp Suite, but I still try to visit the site manually. Then I do two things.

  1. I check the waybackurls.txt file and manually visit the interesting URLs, not all of them.
  2. I analyse the site map for interesting URLs like admin, upload, possible IDOR, API, parameters etc.

Note

I save everything that looks interesting, like paths, parameters, tokens etc., in a separate file. Then I check them again and again to think about what I can do with them and what might be possible.

SSRF and Open Redirect

If I find any URL where I can try an open redirect or SSRF, I basically check for parameters like redirect, url, rdir etc. It is all about analysis, which is manual; you can't really do it with tools. There are tools for finding SSRF and open redirects, but I don't use them. This is just a first look; I keep looking for SSRF and open redirects even after login and after hours of testing, because new paths will obviously pop up after login and more research.
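A quick manual check of the kind I mean, assuming a parameter like redirect= was found (the URL and parameter here are placeholders, not from a real target):

curl -sI "https://example.com/login?redirect=https://evil.example" | grep -i '^location:'

If the Location header points at the attacker-controlled host, the redirect is worth digging into, and the same parameter is a candidate for SSRF testing.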

Account Takeover and IDOR

This is a common part of my testing; I always look for account takeover or related bugs like forgot password poisoning, missing rate limits, reset token flaws, registration issues, password/email/number change etc. Some things I like to try (I can't list everything because most account takeover bugs are logical and depend on the application):

  • I register an account with an already registered email address; if that fails, I try to bypass the check.
  • Try to change the password of someone else's account from my account.
  • Bypass the current-password requirement when changing the email/password.
  • Change my email to a victim's address which is already registered.
  • Delete someone else's account or documents, if the application has any.

XXE, XSS and File Upload

Once I am done with account takeover, I look for XXE; XXE can sometimes be found during registration too. Then I try to find XSS. When I am done with that, I try to upload files, since file upload can lead to XXE, XSS, RCE etc.

CSRF and Cross-Origin

Cross-origin issues are not very easy for me to find. Still, I look for them, and for that I go back to my site map and the file where I have been saving interesting paths.

  • Try to find actions like password change or email change (or others, depending on the application) for CSRF.
  • If there is a CSRF token, try to bypass it, e.g. any valid token is accepted, or switching POST to GET.
  • Remove the CSRF token, or replace it with a null byte.
  • Use a cross-origin misconfiguration to capture the CSRF token or other information (see the quick check below).
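A quick way to check for a permissive cross-origin policy is to send an arbitrary Origin header and look at the CORS response headers (this curl check is my own sketch; the URL and Origin value are placeholders):

curl -sI -H "Origin: https://evil.example" "https://example.com/api/account" | grep -i 'access-control-allow'

If the response reflects the arbitrary origin and allows credentials, it may be possible to read the CSRF token or other account data cross-origin.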

That is it for my testing; I also look for SQL injection, SSTI, SMTP injection and command injection. These are not exact steps or methods you should follow blindly, because so many bugs depend on the application. The same goes for this methodology: use it as a guide, not as the exact answer.

If you are someone who doesn't need a methodology right now but wants to get started and needs a guide, check out my How to Get Started with Bug Bounty blog.

TOOLS

Subdomain: Sublist3r, amass, assetfinder, subfinder, crt.sh
Wayback Machine: waybackurls
Subdomain Takeover: subjack
Screenshot: Aquatone, httprobe
Port Scanning: masscan
Fuzzing / Web Discovery: dirsearch, DirBuster
Parameter Finding: Arjun
Javascript endpoint: JSParser

For more tool-related info, check:
