Burp Suite User Forum

Why does Burp Crawler make 4 requests to every endpoint?

Hans | Last updated: Apr 01, 2019 01:51PM UTC

I'm crawling a site and have noticed that the crawler makes 4 identical requests to every endpoint. What is the purpose of this? It's an authenticated crawl, and it seems to test one URL (4 times) before logging in again (4 times) and then following each redirect and testing the same pages over and over again. Looking through the logs, the crawler has made the exact same GET request to the exact same endpoint over 70 times during the crawl. How is that efficient? I have 2 sets of credentials, so the maximum number of requests I would expect to any particular endpoint during a crawl is 3 (once unauthenticated, plus once for each credential). I'm not sure if this is a bug or what.

Liam, PortSwigger Agent | Last updated: Apr 01, 2019 01:53PM UTC

The crawler employs multiple crawler "agents" to parallelize its work. Each agent represents a distinct user of the application navigating around with their own browser, which is why the same request can appear multiple times in the logs. You can read more about this crawling technique in our online documentation: https://portswigger.net/burp/documentation/scanner/crawling

Have you tried using the "Crawl strategy - fastest" setting?
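As an illustration only (this is not Burp's actual code), the sketch below shows the general idea: each agent keeps its own session state and walks the target independently, so the same GET can legitimately appear once per agent. The URLs, login form fields, and credentials are hypothetical, and Python's requests library stands in for the agents' browsers.

import requests

# Hypothetical endpoints discovered during the crawl.
ENDPOINTS = ["https://example.test/home", "https://example.test/account"]

# One agent per "user": unauthenticated, plus one per credential set.
AGENTS = [
    {"name": "anonymous", "creds": None},
    {"name": "user1", "creds": ("alice", "password1")},
    {"name": "user2", "creds": ("bob", "password2")},
]

def crawl_as(agent):
    # Each agent has its own session (cookie jar), like a separate browser.
    session = requests.Session()
    if agent["creds"]:
        # Hypothetical login form; each authenticated agent logs in on its own.
        session.post("https://example.test/login",
                     data={"user": agent["creds"][0], "pass": agent["creds"][1]})
    for url in ENDPOINTS:
        # The same GET is issued once per agent, so an endpoint reachable by
        # N agents shows up at least N times in the request log.
        resp = session.get(url)
        print(agent["name"], url, resp.status_code)

for agent in AGENTS:
    crawl_as(agent)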
