Burp Suite User Forum

Burp Suite Enterprise - Crawling Issues

Vincent | Last updated: Apr 16, 2020 10:09AM UTC

Hi tech support, I am currently using Burp Suite Enterprise version 2020.2-3025 and am scanning our own web URLs to discover any potential vulnerabilities. However, the scanner does not appear to be able to crawl and discover the sub-directories in the folder path. The scanner was able to detect some of the login pages and populate them with the credentials created in the profile, so I am not sure what is causing this. From some reading, I gathered that robots.txt could be preventing the scanner from crawling deeper into the directories. I therefore created a custom scan configuration under the Misc section and unchecked "get robots.txt", but the scanner was still unable to crawl deeper into the directories that contain other URLs.
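
(For reference, the robots.txt mechanism Vincent mentions works roughly as in the hypothetical snippet below: a Disallow rule asks compliant crawlers to skip everything under a given path. The paths shown are illustrative only and are not taken from Vincent's site.)

    # Hypothetical robots.txt - illustrative example only.
    # A rule like this asks crawlers that honour robots.txt to skip
    # everything under /api/, which would hide sub-directories such
    # as /api/location from any crawler that respects the file.
    User-agent: *
    Disallow: /api/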

Uthman, PortSwigger Agent | Last updated: Apr 16, 2020 10:23AM UTC

Hi Vincent, what scan configuration are you using for the crawl? Have you tried a crawl-only configuration (e.g. Crawl strategy - most complete)?

Vincent | Last updated: Apr 16, 2020 12:53PM UTC

Hi Uthman, I have tried both configurations, but the scanner was still not able to discover the directory files, so I have had to include the URL locations manually myself. The example below is not an exhaustive list.

URLs:
https://xxx.xxx.xxx/
https://xxx.xxx.xxx/api/location

Configurations tried:
1) Audit Checks - Medium Active + Crawl Strategy - more complete + Minimize false positives
2) Crawl strategy - most complete
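
(As an aside on the manual workaround: if the application publishes a standard sitemap, one way to build the list of URLs to seed by hand is to pull them from sitemap.xml. The sketch below is a minimal example of that idea, not a Burp feature; it assumes the site exposes a sitemap.xml at the root, uses only the Python standard library, and the hostname is a placeholder.)

    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholder host - replace with the site actually being scanned.
    SITEMAP_URL = "https://example.com/sitemap.xml"

    # Standard namespace used by sitemap.xml files.
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def sitemap_urls(url):
        """Fetch a sitemap.xml and return the <loc> entries it lists."""
        with urllib.request.urlopen(url) as resp:
            tree = ET.parse(resp)
        return [loc.text.strip() for loc in tree.getroot().findall(".//sm:loc", NS)]

    if __name__ == "__main__":
        # Print one URL per line, ready to paste into the scan's start URLs.
        for u in sitemap_urls(SITEMAP_URL):
            print(u)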

Uthman, PortSwigger Agent | Last updated: Apr 16, 2020 01:14PM UTC