Burp Suite User Forum

Crawl & audit very slow, never finishes (Burp Pro 2020.9)

Corch | Last updated: Sep 05, 2020 08:29AM UTC

I am pretty new to Burp Pro, so I don't have enough experience to know whether this is expected behavior or not. I am hoping some more experienced users can give me some insight here.

I am testing a WordPress-based site running on a dedicated VPS. Burp is running on a VM on my laptop with 4 CPU cores and 8 GB of RAM. Traffic traverses a VPN over a 50/20 Mbps link, but I can get max throughput normally over the VPN, so no issues there.

I launched a scan of the site using the default crawl & audit policies on Friday. The crawler got up to about 900 requests and 100-ish unique locations with a reasonable amount of time remaining on the clock; I think it got down to 2 minutes to go at one point. Then it suddenly jumped up to "5 hours remaining" and stayed there for a long time, and the crawl slowed dramatically from about 2 requests per second to 1 every 5 seconds. Sometimes it would just hang: the request count wouldn't go up for minutes at a time. I got on with some other stuff and let it do its thing in the background, checking on it occasionally. Sometimes the scanner seemed hung, making no requests for a long time, and pausing and restarting it seemed to bring it back to life, albeit still very slowly. I noticed that the time remaining would jump around, sometimes dropping to an hour or so, then climbing back up to 5 hours.

I did have Logger++ running at the start, and I could see that Burp seemed to be making many requests to the same things over and over; every page with external jQuery or Google Analytics on it was causing the same links to get hit thousands of times. I ended up turning Logger++ off because I saw a post suggesting it slowed things down. I tried to edit the scope for the scan, but you can't once it is kicked off, so I tried adding the target host to the scope tab and excluding things like Google Analytics, but that made no difference to the ongoing scan. Eventually I gave up, paused the scan, shut the project down, and decided to try again. By this point the scan had taken up most of the day.

This time I defined the scope at the start, changed the crawl profile to "faster", and set a maximum time limit of 180 minutes. Again it seemed to work fairly efficiently at the start, but once it got to around 1,000 requests it slowed to a crawl again. The timeout didn't fire on time: I checked after 4 hours and the crawl was still running. It did end about 20 minutes later due to the timeout and moved on to the audit stage, which I hoped would be shorter.

That was a vain hope: the audit's first estimate for time to completion was over 2 days for a site with only 278 unique locations. I left it running overnight, and when I checked this morning the estimate was up to "more than 10 days". It still says that now. I can see in the "audit items" tab of the view window that it is progressing... veeeery slowly, but the implications frighten me: for the 278 locations, it has completed 2/2 phases of passive scanning for all of them, 1/3 phases of JavaScript scanning, and is only halfway through the 1st of 5 phases of active scanning. It is not even close to halfway finished and has been running for more than 24 hours.

I can see which locations it is currently testing and watch the number of requests go up, but it's only making a few every 30 seconds. This seems absurd to me for a fairly simple WordPress site, so the first thing I want to know is: am I doing something wrong, or is this normal and expected behavior with the default settings? Has anyone else noticed the same behavior, or run into the same problem where the scanner just keeps adding more work? Are the crawler and scanner really so slow? I find that hard to believe, because frankly, if this is the expected performance of the scanner, then it is 100% useless for real-world applications.

I have tried adjusting the settings to exclude more things, like adding more regexes to the scope exclude list and adding all the Google Analytics cookies to the insertion point exclude list (because they are on every damn page). The only problem is that changes made to an active scan don't seem to take effect: I can see Burp is still making requests to those URLs and testing those cookies on every page.

My second question is: what can I do to speed things up? I tried creating a resource pool with more concurrent requests, but it didn't seem to make much difference once the scanner slows down. From what I've seen so far, it also seems like you need to be pretty specific with scope exclude lists to weed out things like Google Analytics and external jQuery calls; the same goes for insertion points, to avoid testing GA and other third-party tracking cookies, etc. Are there some test types / parameters / etc. I should just avoid because they are slow? Also, I do have the "additional scanner checks" plugin, which I guess would add to the run time, but it seems like its checks are mostly passive?

I'd be grateful if people could share their experiences with the crawl & audit functions. Are they worth the hassle, or should I just stick to manual crawling and testing?
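[Editor's note: one way to make scope exclusions stick is to define them before launch rather than editing a running scan. Below is a minimal sketch using Burp Pro's REST API, assuming it has been enabled in Burp's settings on the default port 1337 with no API key; target.example.com is a placeholder, and the exact JSON field names follow the shape of the API's embedded documentation and may vary between versions.]

    # Sketch: launch a scan with explicit scope exclusions so third-party
    # hosts (Google Analytics, CDN-hosted jQuery, etc.) are never crawled
    # or audited. Assumes Burp's REST API is enabled on 127.0.0.1:1337.
    import requests

    scan_config = {
        "urls": ["https://target.example.com/"],
        "scope": {
            "type": "SimpleScope",
            "include": [{"rule": "https://target.example.com"}],
            # Exclude third-party hosts hit on every page of the target
            "exclude": [
                {"rule": "google-analytics.com"},
                {"rule": "googletagmanager.com"},
                {"rule": "code.jquery.com"},
            ],
        },
    }

    resp = requests.post("http://127.0.0.1:1337/v0.1/scan", json=scan_config)
    resp.raise_for_status()
    # The new scan task's identifier is returned in the Location header
    print("Scan started, task:", resp.headers.get("Location"))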

Uthman, PortSwigger Agent | Last updated: Sep 07, 2020 10:22AM UTC

Hi, it does not look like you are doing anything wrong, but the speed of the scan depends on a range of factors. These include network conditions (e.g. latency, bandwidth, connection errors), the size of the application being scanned, the system resources available on your machine, etc. Is there a WAF that could be slowing the scanner down? Any rate limiting on the server? You mentioned that the scan starts off fast but seems to slow down later. Which version of Burp are you using? Can you please email us at support@portswigger.net so we can assist you further?
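[Editor's note: the WAF/rate-limiting theory is easy to test outside Burp. A rough sketch, using Python's requests library and a placeholder URL: if per-request latency climbs steadily, or 429/503 responses appear after a burst, the server is throttling and the scanner itself is not at fault.]

    # Measure per-request latency to the target to check whether the
    # slowdown is server-side (WAF or rate limiter) rather than Burp.
    import time
    import requests

    url = "https://target.example.com/"  # placeholder for the real target
    for i in range(20):
        start = time.monotonic()
        r = requests.get(url, timeout=30)
        elapsed = time.monotonic() - start
        print(f"request {i + 1}: {r.status_code} in {elapsed:.2f}s")
        if r.status_code in (429, 503):
            # Throttling responses often include a Retry-After header
            print("possible rate limiting, Retry-After:",
                  r.headers.get("Retry-After"))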

Gianluca | Last updated: Oct 22, 2020 02:40PM UTC

Same exact problem here. I tried creating a new project, but the crawler gets stuck anyway and seems to be issuing one request at a time. Meanwhile, our API is being called correctly by the browser emulation, but only / got logged in the site map after hundreds of different connections (checked with Wireshark and the app logs). Is this a version-specific bug? Shall I downgrade?

Hannah, PortSwigger Agent | Last updated: Oct 23, 2020 08:41AM UTC

Hi Gianluca. Could you tell me the version of Burp that you are using? Do you have any extensions installed? If so, could you try disabling all of them and see if that improves your scan? Are you using a temporary project or a disk-based project file? Are you using the default scan configuration or a custom one?

nick | Last updated: Oct 26, 2020 04:03PM UTC

I was experiencing the same situation with the slow, never-going-to-end scans, and unloading/removing some extensions seemed to help speed the scan up. I have not determined which extension was causing the issues yet.
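[Editor's note: rather than re-scanning once per extension, the culprit can be found in log2(n) re-scans by bisecting the enabled set: disable half, run a short crawl, and keep whichever half still reproduces the slowdown. An illustrative sketch of the bookkeeping, where scan_is_slow stands in for the manual step of enabling only the given extensions and re-running a short scan:]

    # Hypothetical helper for bisecting a list of extensions to find the
    # one that slows scans down (assumes a single, reproducible culprit).
    def find_slow_extension(extensions, scan_is_slow):
        candidates = list(extensions)
        while len(candidates) > 1:
            half = candidates[: len(candidates) // 2]
            # Keep whichever half still reproduces the slowdown
            candidates = half if scan_is_slow(half) else candidates[len(half):]
        # Verify the final suspect actually reproduces the problem
        if candidates and scan_is_slow(candidates):
            return candidates[0]
        return None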

Hannah, PortSwigger Agent | Last updated: Oct 27, 2020 10:48AM UTC

Hi Nick. If you do find the extension causing the issue, please let us know so we can look into this further.

Vinay | Last updated: Jan 16, 2021 02:55AM UTC

We are running into a very similar issue in some of our environments (Windows 7 and Windows 10), where Burp's active scan takes extremely long to finish. After spending days trying to figure it out, we still have no lead on the cause. We are using Burp Suite Professional and have seen the issue with version 2.1.07 as well as the latest release, 2020.12.1. It would be good to hear from the community if anyone has found a solution or workaround for this issue. Thanks, Vinay

Richard | Last updated: Jun 13, 2022 10:03AM UTC

I can confirm this issue. It has happened since a recent update. After seeing one of the other users' comments about removing extensions, I loaded up Burp Suite, removed ALL extensions, and re-scanned "https://ginandjuice.shop/". Now requests are FLYING through at the normal expected rate, not 1-2 requests a minute. So this is an issue with an extension and the latest update. It would be great for the team to test before updates are sent out; in-house QA testing would be beneficial.

To update: removing ALL the plugins and manually adding them all back has resolved the slowness. My initial thought is a possible issue in the update/upgrade process that breaks some compatibility in how Burp Pro makes requests. Speed has returned to normal after the manual removal and re-add.

Product improvement request: the ability to save plugins and reload/refresh them all at once.
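[Editor's note: on the save/reload request, Burp's exported user options JSON records the loaded extensions, so the list can at least be pulled out programmatically to speed up re-adding them. A sketch, assuming one version's export layout (user_options -> extender -> extensions, with extension_file entries); key names may differ in other releases, so treat this as illustrative only.]

    # List the extension files recorded in an exported Burp user-options
    # JSON file, to make manual re-adding quicker after a bulk removal.
    import json

    with open("user_options.json") as f:  # path to your exported options
        options = json.load(f)

    for ext in options["user_options"]["extender"]["extensions"]:
        print(ext.get("name"), "->", ext.get("extension_file"))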

Hannah, PortSwigger Agent | Last updated: Jun 13, 2022 12:45PM UTC