The Burp Suite User Forum was discontinued on 1 November 2024.

Burp Suite User Forum

For support requests, go to the Support Center. To discuss with other Burp users, head to our Discord page.


Parallel scanning (Crawl & Audit)

IT | Last updated: Nov 30, 2020 07:20AM UTC

We currently have a Burp Suite Professional license and are running our scans sequentially, which takes a lot of time (days). Given the short time we have, we would like to speed up the scanning and complete it in a shorter period. I could create two new scans, but I would like to understand the impact (if any) of doing so. For example, if I create two new scans (as described below) and run 'Crawl & Audit' on each, would there be any impact or challenges for our security test and reporting?

1. How feasible is this from a system resource and tool perspective?
2. What is the impact on report generation if we group the vulnerability issues by page (e.g. login-related issues in one report and search-related issues in another)?
3. What is the impact on the security test itself? Would the report be the same whether the scanning runs sequentially or in parallel? Is there any chance of an issue being missed?
4. Is there anything critical to consider when scanning requests in parallel?

Our requirement is to run active scanning for two different pages/scenarios:

1. Search product
2. Check out product

Thanks.

Uthman, PortSwigger Agent | Last updated: Nov 30, 2020 10:37AM UTC

Hi,

This really sounds like a perfect use case for Burp Suite Enterprise Edition. Have you considered completing a free 30-day trial? - https://portswigger.net/burp/enterprise/trial

In terms of your questions, please see below:

1. If the scans are launched in two separate instances of Burp, the resources available on your machine will be shared between them - for example, roughly half of the available RAM would go to each instance. I would suggest adjusting the values in the VMOPTIONS file, or setting them when launching Burp from the command line (https://portswigger.net/burp/documentation/desktop/getting-started/launching/command-line) - see the example command below.
2. You can find an example report here showing how issues are presented: https://portswigger.net/burp/samplereport/burpscannersamplereport. The scanner can detect the issues listed here: https://portswigger.net/kb/issues. Alternatively, you can use an extension from the BApp Store (Burp Bounty, Scan Check Builder) to create and report your own custom scanner checks: https://portswigger.net/bappstore/618f0b2489564607825e93eeed8b9e0a
3. There should be no difference in the reports, provided the scans complete. If you are scanning separate sites, issues are reported for each one. You can find more information on reporting here: https://portswigger.net/burp/documentation/desktop/scanning/reporting-results
4. Please make sure that your machine is adequately resourced. Ideally, you would use a dedicated machine if you are trying to run multiple instances of Burp. Burp Suite Enterprise Edition handles this for you, however, and will save you a lot of time.

In terms of your requirements for active scanning, the type of application or page does not generally make a difference. The scanner may have issues with single-page applications, but we are actively working on improvements in this area (https://portswigger.net/blog/burp-suite-roadmap-update-july-2020).
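As a minimal sketch of the command-line approach mentioned in point 1: each Burp instance can be started with an explicit maximum heap size so that two instances do not contend for the same memory. The jar path and the 2 GB figure below are placeholders - adjust them to match your installation and the RAM you actually have available.

    # Terminal 1 - first Burp instance, heap capped at 2 GB (placeholder value)
    java -Xmx2g -jar /path/to/burpsuite_pro.jar

    # Terminal 2 - second Burp instance, also capped at 2 GB
    java -Xmx2g -jar /path/to/burpsuite_pro.jar

If you launch Burp via the native installer instead, the equivalent setting is a single -Xmx line (for example -Xmx2048m) in the .vmoptions file that sits alongside the launcher.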

IT | Last updated: Nov 30, 2020 12:19PM UTC

Thanks for your response, but I think we are still lacking clarity here, so let me rephrase our queries:

1. Would you suggest running parallel scans (crawl & audit) against the same web application but on different modules/functionalities? For instance, the first module could be searching for a product and the other could be check-out.
2. If we do it that way, would one scan impact the other?
3. Does the number of parallel scans depend on the resources of the machine, or is there a limit on scan threads?

Uthman, PortSwigger Agent | Last updated: Nov 30, 2020 12:52PM UTC