Crawling Task - Intercept requests in proxy

Roberto | Last updated: Dec 13, 2023 10:10AM UTC

Using Burp Suite Professional, I need to crawl and audit specific URLs without crawling the entire websites. I therefore created a crawl task with "Maximum link depth" set to zero and practically all other options in the crawling configuration disabled. I then created an audit task, again with all options disabled except for the specific issue I'm testing for. This is all working perfectly, except that I do not want the crawl task to follow 301 and 302 redirects.

I tried adding a "Match and Replace" rule in the Burp Proxy settings to replace the 301/302 status lines with a 200, but the traffic from the crawl task does not seem to be going through the proxy. I say that only because the rule is not firing, and I do not see the traffic logged under the Proxy HTTP history (it is, however, logged under the main Logger tab).

Is there a way to have the traffic of a crawl and audit task go through the proxy (or to stop it following redirects some other way)? Thanks, Roberto
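As an aside, one quick way to confirm what a crawler would receive on a given root URL is a plain curl request: curl does not follow redirects unless -L is passed, so the raw 301/302 status line and Location header stay visible. The hostname below is a placeholder:

# HEAD request; print only the status line and any Location header
curl -sSI https://example.com/ | grep -i -e '^HTTP/' -e '^location:'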

Michelle, PortSwigger Agent | Last updated: Dec 13, 2023 11:58AM UTC

Hi, Burp Proxy and Burp Scanner are two separate tools within Burp Suite Professional, so traffic sent by the Scanner will not go through Burp Proxy. So that we can help you further, can you tell us more about the scope you have set for the scan and where the redirects take you? Would excluding the redirect locations from the scope of the scan be an option? If you'd prefer to share these details directly rather than on the forum, please feel free to send an email to support@portswigger.net.

Roberto | Last updated: Dec 14, 2023 04:59PM UTC

Hi Michelle, The scope is actually a wildcard ".*", as I'm testing (many, many) different websites on different domains, but I only need to test a single URL (the / of each site) for one specific issue. With so many sites, the redirects vary: some go to login pages, some to other websites not in scope, and who knows where else.

I ended up using curl and xargs to process all the URLs in parallel, with curl options to use Burp's proxy and to not follow redirects. It was, however, much less performant and far more prone to errors because timeouts were not handled correctly. In doing this I also found a big issue with sending curl output to /dev/null: when concurrent tasks did that at the same time, those processes would freeze...

If there's a way to do this natively with Burp I'll take it, as I've surely missed valid servers using my workaround. Thanks! Roberto
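For anyone hitting the same limits, here is a minimal sketch of the kind of pipeline described above. The urls.txt file, the responses/ directory, the parallelism level, and the proxy address are all illustrative assumptions; writing each response body to its own file rather than /dev/null sidesteps the freeze mentioned:

#!/usr/bin/env bash
# Sketch: fetch each URL once, in parallel, through Burp's proxy,
# without following redirects (curl's default behaviour).
# Assumptions: urls.txt holds one URL per line; Burp's proxy
# listener is on the default 127.0.0.1:8080.
mkdir -p responses

xargs -P 8 -I{} sh -c '
  url="$1"
  # One output file per URL instead of /dev/null, so concurrent
  # workers never contend on a single sink.
  out="responses/$(printf "%s" "$url" | tr -c "A-Za-z0-9" "_")"
  curl -sS -k \
       -x http://127.0.0.1:8080 \
       --connect-timeout 10 --max-time 20 \
       -o "$out.body" \
       -w "%{http_code} $url\n" \
       "$url"
' _ {} < urls.txt > results.txt

No extra flag is needed to suppress redirect-following, since curl only follows redirects when -L is given; -k is there because Burp re-signs TLS with its own CA, which curl will not trust unless that CA is installed locally. results.txt collects one "status-code URL" line per target.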

Michelle, PortSwigger Agent | Last updated: Dec 15, 2023 09:25AM UTC