The Burp Suite User Forum was discontinued on the 1st November 2024.

Burp Suite User Forum



SQL injections after Burp Scan

Dionisis | Last updated: Oct 20, 2020 09:26AM UTC

Hello everyone, I have Burp Suite Professional v2020.9.1. I scanned the same target twice with the same software. The first time I scanned the target, the results included SQL injections, but when I scanned the same target again with the same software, it did not display the same results, or displayed them differently. How is this justified? How can I be sure of the results when I get different results from scan to scan? Thank you in advance.

Ben, PortSwigger Agent | Last updated: Oct 20, 2020 11:08AM UTC

Hi, to confirm, are you repeating the scans using the same project file, or using a different project file for each scan? If you are using the same project file to conduct further scans on the same website, then Burp will intentionally not display the same issues again, as they have already been recorded. You would need to create a new project file for each repeat scan of the same site.

Dionisis | Last updated: Oct 20, 2020 11:45AM UTC

Could you clarify that? Please be more specific. If I scan a second time to confirm the previous results, will I get the same results, or will they be different each time?

Ben, PortSwigger Agent | Last updated: Oct 20, 2020 02:25PM UTC

Hi, if you scan the same site again using the same project file, then vulnerabilities that were captured in your first scan will not be highlighted in the results of future scans, even if they still exist. This is intentional behavior: Burp records vulnerabilities on a per-host basis, not a per-scan basis. The initial vulnerabilities will have been captured and recorded under that particular host in Target -> Site map as a result of the first scan. If you want all vulnerabilities to be shown for each subsequent scan of a site, you would need to create a fresh project file for each scan.
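As a rough illustration of the "fresh project file per scan" workflow described above: Burp's documented `--project-file` launch option creates the project file if it does not already exist, so generating a unique filename per run gives each scan an empty issue list. This is a sketch, not an official script; the jar path and helper names are hypothetical, so adjust them to your install.

```python
import datetime
import subprocess

# Hypothetical install path - point this at your own burpsuite_pro.jar.
BURP_JAR = "/opt/burp/burpsuite_pro.jar"

def fresh_project_name(target: str) -> str:
    """Build a unique project filename so each scan starts with a clean issue list."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    return f"{target}-{stamp}.burp"

def launch_burp(target: str) -> None:
    # Burp creates the project file if it does not already exist, so every
    # launch below starts from an empty site map and issue list.
    subprocess.run(
        ["java", "-jar", BURP_JAR, f"--project-file={fresh_project_name(target)}"],
        check=True,
    )
```

You would still drive the crawl and audit from the UI as usual; the script only avoids reusing a project file by accident.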

Dionisis | Last updated: Oct 21, 2020 10:27AM UTC

Hi, thanks a lot for the prompt answers. So if I want all vulnerabilities to be shown for each subsequent scan of a site, I need to create a fresh project file for each scan. Does this mean that every time I have to close Burp, reopen it, run a crawl, and then an audit scan? All the steps from the beginning?

Dionisis | Last updated: Oct 21, 2020 12:43PM UTC

Also, I would like to ask the following:
1. I scan a specific host and get no issues/vulnerabilities. I scan the same host again and get 2 SQL injections as a result. How do you explain that?
2. I have a project with some issues in the results. If I manually delete the issues (Target -> Site map -> Issues) and scan the same host again in the same project, will it find the same issues?
3. If an issue has been fixed, do I have to create a new project in order to scan the host again and see whether the problem has been solved?
4. Is there any specific crawl configuration so that it finds all the subpaths of the host on its own (like an edit path), instead of remaining at the initial path?
Thank you in advance for the answers.

Ben, PortSwigger Agent | Last updated: Oct 22, 2020 01:11PM UTC

Hi, in answer to your queries:
1. Has the code behind the web application changed between scans? Are you using the same scan configuration for both scans? Has anything else changed? All conditions being the same, each scan should find the same vulnerabilities.
2. Deleting the issues in the Site map (or indeed deleting the host in its entirety) should have the same effect - the previously discovered vulnerabilities are no longer recorded, so they will be seen as new issues and added. My preference has always been to create a new project file so that I have a clean starting point, but I can certainly see why deleting the previously discovered issues might be preferable.
3. See the answer above. Each scan in Burp checks against all previously discovered issues before it adds a found issue to the issues list and the site map. This is standard Burp behavior and is done to improve the signal-to-noise ratio. Subsequent scans only report new issues or upgrade existing ones. It is worth pointing out that Burp Suite Professional is not designed with full automation in mind (i.e. it is not designed for repeated, regular scanning of web assets to track vulnerabilities over time). If you are looking for automated scanning of target applications, we would recommend that you check out our Enterprise edition, which does have the functionality that you are describing.
4. Can you clarify what you mean by this?
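One way to compare two scans concretely is to export each scan's findings as an XML issue report (right-click the host in Target -> Site map, then Issues -> Report issues) and diff the two files. A minimal sketch, assuming the report's usual layout of an `<issues>` root whose `<issue>` children carry `<name>`, `<host>` and `<path>` elements; the function names are illustrative, not part of Burp:

```python
import xml.etree.ElementTree as ET

def issue_keys(report_path: str) -> set:
    """Collect (name, host, path) triples from a Burp XML issue report."""
    root = ET.parse(report_path).getroot()
    keys = set()
    for issue in root.iter("issue"):
        # findtext returns the element's text, or the default if it is absent.
        name = issue.findtext("name", default="")
        host = issue.findtext("host", default="")
        path = issue.findtext("path", default="")
        keys.add((name, host, path))
    return keys

def diff_reports(first: str, second: str) -> None:
    """Print which issues appear in only one of the two reports."""
    a, b = issue_keys(first), issue_keys(second)
    for key in sorted(b - a):
        print("only in second scan:", key)
    for key in sorted(a - b):
        print("only in first scan: ", key)
```

If the set difference is empty, both scans found the same issues and any apparent discrepancy is down to how the project file deduplicates what it displays.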

Dionisis | Last updated: Oct 27, 2020 08:52AM UTC

Hi,
1. I am using the same scan configuration and the code behind the web application has not changed. I can't understand why the first scan found no vulnerabilities while the second scan I performed did.
4. I mean that when I scan a host with Burp for the first time, I run a crawl in order to find all the paths of the host. As you can see in the link below, Burp has found the initial paths but has not gone deeper into the subpaths; the subpaths are gray. Is there any specific crawl configuration so that it finds all the subpaths of the host (like the ne_info -> edit path) on its own, so the user doesn't have to visit them manually? https://imgur.com/a/W3ahyWX

Ben, PortSwigger Agent | Last updated: Oct 28, 2020 10:56AM UTC

Hi, have both scans completed successfully and, if so, have they both covered the same URLs during the scanning process? When items in the Site map appear in gray, it usually means that Burp has discovered links to them in content it has requested, but has not yet sent requests to that discovered content itself. How are you carrying out your initial crawl - via the New Scan button on the Dashboard, or some other method?

Dionisis | Last updated: Oct 29, 2020 10:24AM UTC

Hi, yes, both scans completed successfully. I haven't checked whether they covered the same URLs during the scanning process; I will check that! I am carrying out the initial crawl via Target -> Site map -> right-click -> Scan -> Open scan launcher.

Dionisis | Last updated: Oct 29, 2020 11:11AM UTC

Also, if items don't appear on the site map and must be added manually, is there any setting I can change so that Burp finds all the items automatically after a crawl scan?

Liam, PortSwigger Agent | Last updated: Oct 30, 2020 11:56AM UTC

Hi Dionisis. When the items don't appear on the site map, are these items that have appeared on previous scans? Or is Burp consistently failing to find these items?

Dionisis | Last updated: Nov 04, 2020 02:57PM UTC

Hi, sometimes they appear and sometimes they don't. But most of the time I have to click the object in the browser for it to appear in the site map.

Liam, PortSwigger Agent | Last updated: Nov 05, 2020 05:09PM UTC

This sounds strange. Is this application publicly accessible? If so, could we obtain permission to perform testing?

Dionisis | Last updated: Nov 13, 2020 11:02AM UTC

It's a little bit difficult. Could you please tell me, if an error occurs in the first scan, how I could catch the same error again with a second scan? If the items don't appear on the site map and must be added manually, is there any setting I can change so that Burp finds all the items automatically after a crawl scan? Could you please tell me the steps from the first scan?

Liam, PortSwigger Agent | Last updated: Nov 13, 2020 03:33PM UTC