Burp Suite User Forum


some questions about using burp suite professional

Ilqar | Last updated: Feb 20, 2022 06:35PM UTC

Hello. I have some questions about using Burp Suite Professional:

1) How can I fix the high memory usage of Burp Suite Professional? It uses a lot of memory and my computer freezes while Burp Suite is working. I have configured Java with the options -Xms100m -Xmx5g, but this did not help. I launched a crawl and audit of a large website and after some time my computer froze. Task Manager showed 92% physical memory usage while Burp Suite was running. My processor is an i7-4790 and I have 8 GB of RAM. I also increased the heap size to 8 GB, which did not help either. So how can I configure Java or Burp Suite to use memory only partially and write data to the hard disk instead of keeping everything in memory? At the moment the computer freezes and I cannot do any other work on it.

2) Why does Burp Suite not crawl some websites? The crawl starts and finishes immediately with no errors. My crawl strategy was "most complete". In the Target tab I saw that it crawled just 12 locations, but the website has more than 5,000 links. I crawled these websites with OWASP ZAP successfully; every website that Burp Suite failed to crawl was fully crawled by OWASP ZAP.

3) Can I import URLs crawled by OWASP ZAP into Burp Suite?

Thanks for your answers.

Uthman, PortSwigger Agent | Last updated: Feb 21, 2022 10:57AM UTC

Hi,

Please see the answers to your questions below:

  1. Have you tried applying some of the optimization techniques described in our documentation?

By default, Burp will use half of the memory available on your system (if -Xmx is not manually specified) so specifying 100% of the memory available will not help.
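For example, if you start Burp from the command line, you can cap the heap explicitly. On an 8 GB machine, a cap of around 2-4 GB leaves headroom for the operating system and the embedded browser (the jar path below is only illustrative, so adjust it for your installation):

    java -Xmx2g -jar /path/to/burpsuite_pro.jar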

Can you also share the information below?

  • Can you describe the exact steps needed to replicate your issue?
  • Do you have multiple scans and Intruder attacks running at the same time?
  • Do you have any resource-intensive applications (e.g. Docker or a local web server) running at the same time as Burp?
  • Do you have any extensions enabled? If you disable them, do you notice any improvement in performance?

  2. Please email support@portswigger.net with the information below:
    • Comparison of the Burp vs ZAP output
    • Is the site publicly accessible? If so, can you provide us permission to scan it to investigate this further?
    • Run a new scan with the crawl debug mode enabled and share the log file. You can enable debug mode via Dashboard > New scan > Scan configuration > New > Crawling > Crawl Optimization > click the cog button > Enabling logging
  3. You can do this using the Import To Sitemap extension, which is available from the BApp Store.
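If you prefer not to rely on an extension, a rough alternative is to replay a plain list of URLs exported from ZAP through Burp's proxy listener so that the requests and responses populate the Target site map. The sketch below assumes the proxy is on its default address of 127.0.0.1:8080 and that the exported URLs are in a file called zap_urls.txt (one absolute URL per line); both names are illustrative rather than anything from this thread:

    # Replay a list of URLs through Burp's proxy so they appear in the site map.
    import requests
    import urllib3

    # Burp re-signs TLS traffic with its own CA certificate, so certificate
    # warnings are expected here; they are silenced for the sake of the sketch.
    urllib3.disable_warnings()

    BURP_PROXY = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}

    with open("zap_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            requests.get(url, proxies=BURP_PROXY, verify=False, timeout=10)
        except requests.RequestException as exc:
            print(f"Could not replay {url}: {exc}")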

For your crawl issue, please share the information requested so that we can investigate this properly for you.



Ilqar | Last updated: Feb 21, 2022 01:57PM UTC

Hello. Thanks for the answers. Regarding your questions:

1) I did not have multiple scans and/or Intruder attacks running at the same time, I did not have any resource-intensive applications running while Burp was working, and I did not have any extensions enabled. However, I will try running the crawl and audit without any Java options and will reply if the high memory usage occurs again, although I suspect it will.

2) The website that Burp did not crawl is publicly accessible. I have emailed the log file of this crawl to support@portswigger.net. I cannot give you permission to scan it because the site is not mine; I only gave it as an example of Burp not crawling some sites.

3) Thanks, I will use this extension. Can you recommend a good web crawler for Windows? I have not been able to find one: some cannot use the HTTPS protocol, some are limited in the number of URLs, some give errors, and some crash. I cannot use the OWASP ZAP crawler either, because it also uses a lot of memory.

Uthman, PortSwigger Agent | Last updated: Feb 21, 2022 02:15PM UTC

Hi Ilqar,

I have replied to your query via email.

For point 1, please share the requested information (a screen recording of the issue being replicated, plus diagnostics) so that we can investigate this properly for you.

In the case of the site you are scanning, I think disabling the 'Request robots.txt' option may help, since the debug log file shows the crawler repeatedly hitting only that endpoint (/robots.txt).

In terms of your third point, Burp Scanner does a great job at crawling. This is further enhanced by our implementation of browser-powered scanning.

It usually takes running a few scans with different scan configurations before you find out which suits your site/application best.
