Mapping Attack Surface - Crawling/Spidering

Crawling (or spidering) an application is a key step in mapping its attack surface. The crawler enumerates the structure of the web application, including its navigation paths, linked pages, scripts, and other content, giving you an inventory of endpoints to test.

# Crawling one site (-o is the output folder, -c sets concurrency, --blacklist takes a URL regex)
gospider --site $domain -o gospider-output -c 10 --blacklist "\.(png|jpg|css|gif|svg|mp4|mov|avi|wmv|mp3|wav|ogg|woff|woff2)$"

# Crawling a pool of sites listed in a TXT file (one URL per line)
gospider --sites scope.txt -o gospider-output -c 10 --blacklist "\.(png|jpg|css|gif|svg|mp4|mov|avi|wmv|mp3|wav|ogg|woff|woff2)$"

# Crawling with an active (authenticated) session by supplying the session cookie
gospider --site $domain --cookie "<COOKIE>"

# Crawling with katana (reads targets from stdin; -silent prints only discovered URLs)
echo $domain | katana -silent
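Crawler output usually contains duplicates and noise, so a common follow-up is to extract and deduplicate the discovered URLs before feeding them into other tools. A minimal sketch, assuming gospider's default `[type] - ... - <url>` line format (the sample file below is fabricated for illustration; point the pipeline at your real output instead):

```shell
# Fabricated sample of gospider-style output, used here only to demonstrate the pipeline
cat > /tmp/gospider-sample.txt <<'EOF'
[url] - [code-200] - https://example.com/login
[url] - [code-200] - https://example.com/login
[javascript] - https://example.com/static/app.js
[url] - [code-404] - https://example.com/old
EOF

# Pull out every URL, then deduplicate: grep -oE prints only the matching
# substring, and sort -u collapses repeated entries
grep -oE 'https?://[^ ]+' /tmp/gospider-sample.txt | sort -u
```

The same pipeline works on katana output, which already prints bare URLs, in which case `sort -u` alone is enough.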
