The Burp Suite User Forum was discontinued on the 1st November 2024.

Burp Suite User Forum


How do I do passive crawling of my proxy history retrospectively?

Felipe | Last updated: Mar 30, 2024 05:44PM UTC

I had my live passive crawl task set to include all URLs passing through Burp, then turned off capturing for two weeks because I needed better performance, which left around 60,000 items in the passive crawl queue. I then decided I only wanted items within a custom scope to be crawled, so I changed the task configuration to limit the scope. However, thousands of out-of-scope items remained queued, because those items were already enqueued and the configuration change only seems to apply to new requests.

So I thought: 'I'll just delete the task, wiping out the queue, and create another live passive crawl task, and it will crawl all of my history, or at least the history the previous task never processed.' My assumption was that if the scope of a passive crawl is adjustable, then surely it would also be possible to run passive crawling retrospectively, to catch up on items that were previously excluded by the custom scope. It turns out that when I created a new live passive crawl, the queue contained 0 items; it only processes new requests as they arrive. I suppose that can be inferred from 'live' in the task name, but it contradicts the idea of an adjustable scope: if you can change the scope, it makes no sense for the task to be 'one shot'.

I have been searching for a way to make this task operate over my existing history, not just new items, because I now have history that was never passively crawled and is unaccounted for in the site tree. This should be trivial, since the task already works on requests and responses that are stored in the database anyway. I cannot find a way.
I looked into the extension API, and it does not support sending an item for passive crawling, only for active crawling, active vulnerability scanning, and passive vulnerability scanning. I am extremely frustrated: I have lost weeks' worth of links/endpoints that were in responses queued for passive crawling, and there is no way to invoke this functionality manually. I just can't fathom why anybody would want the option of adjusting the scope of passive crawling, but not be able to re-run it over previous requests/responses, given the outcomes may differ significantly with a changed scope. Please help me achieve this.
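For what it's worth, here is a sketch of the partial workaround the extension API does allow: a Montoya-API extension that walks the proxy history and feeds in-scope items into a passive *audit*. This is not a passive *crawl* (which, as noted above, the API doesn't expose), so it will not extract links into the site tree; it only re-runs passive vulnerability checks over old traffic. The class name and extension name are my own; the Montoya calls used (`proxy().history()`, `scope().isInScope()`, `scanner().startAudit()`, `Audit.addRequestResponse()`) are assumed to behave as described in PortSwigger's Montoya API documentation, and the code can only run inside Burp as a loaded extension.

```java
import burp.api.montoya.BurpExtension;
import burp.api.montoya.MontoyaApi;
import burp.api.montoya.http.message.HttpRequestResponse;
import burp.api.montoya.proxy.ProxyHttpRequestResponse;
import burp.api.montoya.scanner.Audit;
import burp.api.montoya.scanner.AuditConfiguration;
import burp.api.montoya.scanner.BuiltInAuditConfiguration;

// Hypothetical extension: re-run passive *audit* checks over proxy history.
// Passive *crawl* is not exposed by the API, so the site map will not gain
// links extracted from these responses.
public class RetroactivePassiveAudit implements BurpExtension {
    @Override
    public void initialize(MontoyaApi api) {
        api.extension().setName("Retroactive passive audit");

        // Start an audit task limited to the built-in passive checks.
        Audit audit = api.scanner().startAudit(
                AuditConfiguration.auditConfiguration(
                        BuiltInAuditConfiguration.LEGACY_PASSIVE_AUDIT_CHECKS));

        // Walk the existing proxy history and queue only items that match
        // the target scope currently configured in Burp.
        for (ProxyHttpRequestResponse item : api.proxy().history()) {
            if (api.scope().isInScope(item.finalRequest().url())) {
                audit.addRequestResponse(HttpRequestResponse.httpRequestResponse(
                        item.finalRequest(), item.originalResponse()));
            }
        }
    }
}
```

If something like this were viable, it would at least recover the passive-audit findings for the backlog, though the missing site-tree entries from link extraction would still be lost.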

Ben, PortSwigger Agent | Last updated: Apr 02, 2024 10:22AM UTC