Burp Suite User Forum


Crawl & audit very slow, never finishes (Burp pro 2020.9)

Corch | Last updated: Sep 05, 2020 08:29AM UTC

I am pretty new to Burp Pro, so I don't have enough experience to know whether this is expected behavior or not. I am hoping some more experienced users can give me some insight here.

I am testing a WordPress-based site running on a dedicated VPS. Burp is running on a VM on my laptop with 4 CPU cores and 8 GB of RAM. Traffic traverses a VPN over a 50/20 Mbps link, but I can get maximum throughput through the VPN normally, so no issues there.

I launched a scan of the site using the default crawl & audit policies on Friday. The crawler got up to about 900 requests and 100-ish unique locations with a reasonable amount of time remaining on the clock; I think it got down to 2 minutes to go at one point. Then it suddenly jumped up to "5 hours remaining" and stayed there for a long time, and the crawl slowed down dramatically, from about 2 requests per second to 1 every 5 seconds. Sometimes it would just hang: the request count wouldn't go up for minutes at a time.

I got on with some other stuff and let it run in the background, checking on it occasionally. Sometimes the scanner seemed hung, making no requests for a long time, and pausing and resuming seemed to bring it back to life, albeit still very slowly. I also noticed that the time remaining would jump around, sometimes dropping to an hour or so, then climbing back up to 5 hours.

I had Logger++ running at the start, and I could see that Burp seemed to be making many requests to the same things over and over; every page that loaded external jQuery or Google Analytics was causing the same links to get hit thousands of times. I ended up turning Logger++ off because I saw a post suggesting it slowed things down. I tried to edit the scope for the scan, but you can't once it has been kicked off, so I tried adding the target host to the scope tab and excluding things like Google Analytics, but that made no difference to the ongoing scan.
Eventually I gave up, paused the scan, shut the project down, and decided to try again; by that point the scan had taken up most of the day. This time I defined the scope at the start, changed the crawl profile to "faster", and set a maximum time limit of 180 minutes. Again it seemed to work fairly efficiently at the start, but once it got to around 1,000 requests it slowed to a crawl again. The time limit didn't work as expected: I checked after 4 hours and the crawl was still running. It did end about 20 minutes later due to the timeout and moved on to the audit stage, which I hoped would be shorter.

Well, that was a vain hope. The audit's first estimate for time to completion was over 2 days for a site with only 278 unique locations. I left it running overnight, and when I checked this morning the estimate was up to "more than 10 days". It still says that now. I can see in the "Audit items" tab of the view window that it is progressing... veeeery slowly, but the implications frighten me: for the 278 locations, it has completed 2/2 phases of passive scanning for all of them, 1/3 phases of JavaScript scanning, and is only halfway through the 1st of 5 phases of active scanning. It is not even close to halfway finished and has been running for more than 24 hours. I can see which locations it is currently testing and watch the request count go up, but it's only making a few requests every 30 seconds.

This seems absurd to me for a fairly simple WordPress site, so the first thing I want to know is: am I doing something wrong, or is this normal and expected behavior when using default settings? Has anyone else noticed the same behavior, or run into the same problem where the scanner just keeps adding more work? Are the crawler and scanner really so slow? I find that hard to believe, because frankly, if this is the expected performance of the scanner, then it is 100% useless for real-world applications.
I have tried adjusting the settings to exclude more things, such as adding more regexes to the scope exclude list and adding all the Google Analytics cookies to the insertion point exclude list (because they are on every damn page). The only problem is that changes made to an active scan don't seem to take effect: I can see Burp is still making requests to those URLs and testing those cookies on every page.

My second question is: what can I do to speed things up? I tried creating a resource pool with more concurrent requests, but it didn't seem to make much difference once the scanner slowed down. From what I've seen so far, it also seems like you need to be pretty specific with scope exclude lists to weed out things like Google Analytics and external jQuery calls, and the same goes for insertion points, to avoid testing GA and other third-party tracking cookies, etc. Are there some test types / parameters / etc. I should just avoid because they are slow?

Also, I do have the "Additional Scanner Checks" extension, which I guess would add to the run time, but it seems like its checks are mostly passive. I'd be grateful if people could share their experiences with the crawl & audit functions. Are they worth the hassle, or should I just stick to manual crawling and testing?
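For anyone comparing notes, these are the kinds of exclude patterns I mean (the hostnames here are just illustrative examples of the third-party noise on my target, not an official list). I sketched the matching in Python so the regexes can be sanity-checked outside Burp before pasting them into the scope exclude list:

```python
import re

# Example third-party hosts that kept getting crawled over and over.
# These are illustrative patterns only; adjust them to your own target.
EXCLUSIONS = [
    r"google-analytics\.com",
    r"googletagmanager\.com",
    r"ajax\.googleapis\.com",  # external jQuery CDN
]

def is_excluded(url: str) -> bool:
    """Return True if the URL matches any exclusion pattern."""
    return any(re.search(pattern, url) for pattern in EXCLUSIONS)

# Quick sanity check: third-party tracker is excluded, target page is not.
print(is_excluded("https://www.google-analytics.com/analytics.js"))  # True
print(is_excluded("https://myblog.example/wp-login.php"))            # False
```

Note the escaped dots: an unescaped `.` matches any character, so a sloppy pattern can accidentally exclude in-scope URLs.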

Uthman, PortSwigger Agent | Last updated: Sep 07, 2020 10:22AM UTC

Hi,

It does not look like you are doing anything wrong, but the speed of the scan depends on a range of factors, including network conditions (e.g. latency, bandwidth, connection errors), the size of the application being scanned, the system resources available on your machine, etc.

Is there a WAF that could be slowing the scanner down? Any rate limiting on the server? You mentioned that the scan starts off fast but seems to slow down later. Which version of Burp are you using?

Can you please email us at support@portswigger.net so we can assist you further?
