Information Gathering
Passive reconnaissance
➡️ Physical engagement / Social engineering
Location information like
satellite images
drone recon
building layout (badge readers, security, fencing, etc)
Job information
employees (name, job title, phone number, etc)
pictures (badge photos, desk photos, computer photos, etc)
➡️ Web / Host Assessment
target validation
whois, nslookup, dnsrecon
finding subdomains
Google, dig, nmap, crt.sh, etc.
fingerprinting
nmap, wappalyzer, netcat, etc.
data breaches
HaveIBeenPwned, Breach-Parse, WeLeakInfo
Target
❗ Always refer to a Bug Bounty program to find valid targets that can be legally tested
🔗 Bugcrowd
🧪 e.g. Tesla
Read the program details, follow the terms and stay in scope
The following tests will be made on the `*.tesla.com` target.
Discovering email addresses
The goal is to discover public email addresses and check whether they really exist.
➡️ Hunter.io (free registration) - Find email addresses from any company name or website

➡️ Phonebook.cz (free registration) - Phonebook lists all domains, email addresses, or URLs for the given input domain
➡️ VoilaNorbert
➡️ Clearbit Connect (Chrome extension)
➡️ EmailHippo Email address verify - Free email address verification tool
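Once a naming convention is spotted (e.g. from one confirmed address), candidate addresses for other employees can be generated and fed into a verification tool. A minimal sketch; the name and domain below are made-up examples:

```shell
# Generate common corporate email permutations for one known employee name
first=john; last=doe; domain=tesla.com
initial=$(printf '%s' "$first" | cut -c1)
printf '%s\n' \
  "$first.$last@$domain" \
  "$first@$domain" \
  "$initial$last@$domain" \
  "$first$last@$domain"
```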
Breached credentials
➡️ HaveIBeenPwned - Check if your email address is in a data breach
➡️ breach-parse - A tool for parsing breached passwords
The `BreachCompilation` password list (a 44 GB file) comes from breached password dumps.

```bash
breach-parse @tesla.com tesla.txt "~/Downloads/BreachCompilation/data"
```

Credential stuffing and password spraying can be done using the results.
➡️ DeHashed.com (subscription) - public data search-engine
Hashed passwords or other data can be found
Collect all the data (email, username, IP, address, etc.) with the goal of finding patterns that could be related to personal accounts too
Investigate further to tie the data to other accounts
Use tools to try to crack the hashed passwords, like Hashes.com, Google, etc.
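Before cracking, it helps to recognize what kind of hash was dumped. A sketch (the helper name is made up, not from the course) that guesses common unsalted hash types by hex length, before choosing a cracking mode (e.g. `hashcat -m 0` for MD5):

```shell
# Guess common unsalted hash types by string length (illustrative heuristic)
guess_hash() {
  case "${#1}" in
    32) echo "MD5"     ;;
    40) echo "SHA-1"   ;;
    64) echo "SHA-256" ;;
    *)  echo "unknown" ;;
  esac
}
guess_hash "5f4dcc3b5aa765d61d8327deb882cf99"   # MD5 of "password"
```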

Hunting subdomains
Identify subdomains
➡️ Sublist3r (outdated) - enumerate subdomains of websites using OSINT
```bash
sudo apt install sublist3r

sublist3r -d tesla.com
sublist3r -d tesla.com -t 100 -v
```

➡️ crt.sh - look for registered certificates and find subdomains or sub-subdomains

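crt.sh results can also be fetched as JSON and reduced to a clean hostname list. A sketch with the JSON inlined as sample data; a live query would be `curl -s "https://crt.sh/?q=%25.tesla.com&output=json"`:

```shell
# Extract unique hostnames from crt.sh JSON output (sample data inlined)
echo '[{"name_value":"www.tesla.com"},{"name_value":"shop.tesla.com"},{"name_value":"www.tesla.com"}]' \
  | grep -o '"name_value":"[^"]*"' \
  | cut -d'"' -f4 \
  | sort -u
```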
➡️ amass - in-depth attack surface mapping and asset discovery
```bash
sudo apt install amass

amass enum -d tesla.com
amass enum -d syselement.com
```

➡️ httprobe - take a list of domains and probe for working (alive) http and https servers
```bash
# Go is necessary (installed via pimpmykali.sh)
go install github.com/tomnomnom/httprobe@latest
# or on Kali
sudo apt install httprobe

cat tesla.com/recon/final.txt | httprobe

# Skip default probes, and use only the https:443 probe
cat tesla.com/recon/final.txt | httprobe -s -p https:443

# Strip only subdomains from the list
# (tr -d ':443' would delete the characters ':', '4', '3' anywhere in the
#  line, so use sed to remove only the trailing port)
cat tesla.com/recon/final.txt | sort -u | httprobe -s -p https:443 | sed 's/https\?:\/\///' | sed 's/:443$//'
```

➡️ assetfinder - find domains and subdomains related to a given domain
```bash
# Go is necessary (installed via pimpmykali.sh)
# (go get -u for binaries is deprecated; use go install instead)
go install github.com/tomnomnom/assetfinder@latest
# or on Kali
sudo apt install assetfinder

assetfinder syselement.com
assetfinder --subs-only tesla.com
```

Screenshotting websites
➡️ gowitness - A golang, web screenshot utility using Chrome Headless
```bash
# Go is necessary (installed via pimpmykali.sh)
go install github.com/sensepost/gowitness@latest
# or on Kali
sudo apt install gowitness

gowitness scan single --url "https://tesla.com" --write-db
gowitness scan single --url "https://blog.syselement.com"
```

Website technologies
➡️ BuiltWith.com - find out what websites are built with

➡️ Wappalyzer.com - via browser extension
by visiting the webpage, interact with the browser extension to check the website technologies

➡️ WhatWeb
```bash
whatweb https://blog.syselement.com/
```
Automated recon script
A little `bash` script for subdomain hunting:
```bash
#!/bin/bash
url=$1

if [ ! -d "$url" ]; then
    mkdir "$url"
fi
if [ ! -d "$url/recon" ]; then
    mkdir "$url/recon"
fi

# Assetfinder #
echo "[+] Harvesting subdomains with assetfinder..."
assetfinder "$url" >> "$url/recon/assets.txt"
# keep only subdomains containing $url
grep "$url" "$url/recon/assets.txt" >> "$url/recon/final.txt"
rm "$url/recon/assets.txt"

# Amass #
# echo "[+] Harvesting subdomains with amass..."
# amass enum -d "$url" >> "$url/recon/f.txt"
# sort -u "$url/recon/f.txt" >> "$url/recon/final.txt"
# rm "$url/recon/f.txt"

# httprobe #
echo "[+] Probing for alive domains..."
sort -u "$url/recon/final.txt" | httprobe -s -p https:443 | sed 's/https\?:\/\///' | sed 's/:443$//' >> "$url/recon/alive.txt"
```
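As an aside, the two existence checks at the top of the script can be collapsed with `mkdir -p`, which creates the whole path in one call and is a no-op when the directories already exist:

```shell
# One call replaces both if/mkdir blocks; safe to re-run
url=tesla.com
mkdir -p "$url/recon"
ls -d "$url/recon"   # tesla.com/recon
```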
➡️ sumrecon - web recon script

```bash
wget https://raw.githubusercontent.com/Gr1mmie/sumrecon/refs/heads/master/sumrecon.sh
```

TCM's modified final script
- Creates a directory structure for reconnaissance under a given URL
- Harvests subdomains using `assetfinder`
- Filters valid subdomains and saves them to `final.txt`
- Checks for live domains using `httprobe`
- Identifies potential subdomain takeovers using `subjack`
- Scans for open ports using `nmap`
- Scrapes archived URLs from `waybackurls`
- Extracts parameters from Wayback Machine data
- Categorizes JavaScript, PHP, JSON, JSP, and ASPX files from Wayback Machine data
- Removes temporary files to keep the structure clean
- (Commented out) Could run `amass` for subdomain discovery and use `EyeWitness` for screenshots
```bash
# 0. Requirements
sudo apt install amass assetfinder httprobe gowitness nmap subjack
go install github.com/tomnomnom/waybackurls@latest

# 1. Copy the code here https://pastebin.com/raw/MhE6zXVt to a new file
# 2. Fix the last 2 lines to use gowitness and uncomment them:
#    echo "[+] Running gowitness against all compiled domains..."
#    gowitness scan file -f $url/recon/httprobe/alive.txt

chmod +x finalrecon.sh
./finalrecon.sh syselement.com
```
Check these additional resources:
➡️ The Bug Hunter's Methodology - Full 2-hour training by Jason Haddix
Using Burp Suite
➡️ Burp Suite

Google Fu
➡️ Google.com
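Google dorks combine operators such as `site:`, `filetype:`, `inurl:` and `intitle:` to narrow results to a target; a few illustrative queries (not from the course):

```
site:tesla.com -www                # results from subdomains other than www
site:tesla.com inurl:login         # pages with "login" in the URL
site:tesla.com intitle:"index of"  # open directory listings
```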
```
site:tesla.com filetype:pdf
```

Social Media
LinkedIn, Twitter (X) or other public websites can be used for social media OSINT (Open-Source Intelligence).