Burp Suite User Forum


Active Scan issues in BURP REST API

ARPIT | Last updated: Oct 25, 2022 02:30PM UTC

Hi Team, I tried scanning a web application using Burp with the "Audit coverage - thorough" configuration. When entering the scope URLs, under the Advanced scope category, I added some URLs to "Include URL prefixes", but these were not crawled during the scan. Can you please help me out? Thanks, Arpit
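For context, when a scan is driven through the Burp REST API, the seed URLs, the URL-prefix scope, and a named configuration such as "Audit coverage - thorough" are all passed in the JSON body of a POST to the /v0.1/scan endpoint. A minimal sketch follows; the host, port, and target URLs are placeholders, and the exact field names should be double-checked against the API documentation for your Burp version:

```python
import json

# Build the JSON body for POST http://localhost:1337/v0.1/scan
# (1337 is Burp's default REST API port; adjust to your setup).
def build_scan_payload(start_urls, include_prefixes, config_name):
    return {
        "urls": start_urls,  # seed URLs for the crawl
        "scope": {
            # Each include rule is a URL prefix, as in "Include URL prefixes".
            "include": [{"rule": p} for p in include_prefixes],
            "type": "SimpleScope",
        },
        # Built-in configurations can be referenced by name.
        "scan_configurations": [
            {"name": config_name, "type": "NamedConfiguration"}
        ],
    }

payload = build_scan_payload(
    ["https://target.example/"],
    ["https://target.example/a/b/"],
    "Audit coverage - thorough",
)
print(json.dumps(payload, indent=2))

# To actually launch the scan (requires the REST API to be enabled in Burp):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:1337/v0.1/scan",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)  # on success, the task id is in the Location header
```

If an include prefix never shows up in the crawl, it is worth confirming that it appears in this request body exactly as intended (trailing slashes and scheme included).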

Liam, PortSwigger Agent | Last updated: Oct 26, 2022 07:12AM UTC

Hi Arpit. Thanks for your message. Do you see any errors in the Event log? Are the ignored URLs REST API related?

ARPIT | Last updated: Nov 04, 2022 11:01AM UTC

Thanks for your response. The ignored URLs are not REST API related. However, the issue persists: the scanner is not crawling all the pages present in the site map. Can you suggest any configuration changes so that all the pages in the site map are crawled? Thanks, Arpit

Liam, PortSwigger Agent | Last updated: Nov 04, 2022 02:03PM UTC

Is the application publicly accessible? If so, could you provide us with access?

ARPIT | Last updated: Nov 06, 2022 08:40AM UTC

The application is not publicly accessible.

Liam, PortSwigger Agent | Last updated: Nov 07, 2022 08:56AM UTC

Hi Arpit. Is there any further information you can provide us regarding the application? If so, could you email us at support@portswigger.net, please?

ARPIT | Last updated: Nov 07, 2022 05:13PM UTC

I am scanning the application using the Burp REST API. While crawling the application, the crawler visits a few pages repeatedly, but when the scan completes, we do not see those pages under "Audit items". Can you suggest any further changes? Meanwhile, I will share the details over email as well. Thanks, Arpit

Liam, PortSwigger Agent | Last updated: Nov 08, 2022 09:30AM UTC

Thanks, Arpit. If you can attach screenshots demonstrating the issue to the email, that would be much appreciated.

ARPIT | Last updated: Nov 23, 2022 06:30AM UTC

Hi Liam, It is not possible to share screenshots since the application is on the client's network. Let me know if there is another way of connecting so that I can demonstrate the multiple issues I am seeing. Also, this issue occurs across multiple applications.

Description:
- In the Event log tab, it says: found 71 new locations after logging in.
- In the Details tab, it says: Unique locations: 72.
- In the Audit items tab, it says: total items: 268, in which I can see multiple URLs repeated 7-10 or more times.
(The above numbers change drastically in every scan, based on the custom configurations.)

While it was crawling, it said: Currently crawling: www.example.com/a/b/parties. But when the crawl completes, it doesn't show the "parties" URL mentioned above, and this happens with a lot of URLs.

Configurations:
Crawling:
- Maximum link depth: 10
- Crawl strategy: most complete
- Max crawl time: 60 minutes
- Max unique locations discovered: 100

Auditing:
- Audit speed: thorough
- Audit accuracy: normal
- Max insertion points per request: 10

While auditing, if we go to the Audit items tab, in the "Active phases" column, phase 1 is in red and shows an error, which causes my application account to be locked out. I have raised that issue on the following URL: https://forum.portswigger.net/thread/error-skipping-current-scanner-checks-bc7080be
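As an aside, the counters described above can also be read programmatically by polling the scan task endpoint (GET /v0.1/scan/{task_id}), which makes it easier to compare numbers across runs. A small sketch with a stubbed response; the scan_metrics field names are my assumption about the response schema and should be verified against your API version:

```python
MAX_UNIQUE_LOCATIONS = 100  # the cap from the crawl configuration above

def crawl_capped(status_json, cap=MAX_UNIQUE_LOCATIONS):
    """Return True if the crawler hit the configured unique-locations limit.

    status_json is the parsed body of GET /v0.1/scan/{task_id}; the
    scan_metrics object and its counter names are assumptions here.
    """
    metrics = status_json.get("scan_metrics", {})
    visited = metrics.get("crawl_unique_locations_visited", 0)
    return visited >= cap

# Example with a stubbed status response:
sample = {
    "scan_status": "auditing",
    "scan_metrics": {"crawl_unique_locations_visited": 100},
}
print(crawl_capped(sample))  # True: the crawl stopped at the configured limit
```

If the visited-locations counter sits at the configured maximum when the crawl ends, pages discovered after that point would be dropped, which is one possible reason a URL seen during crawling never appears in the final results.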

Liam, PortSwigger Agent | Last updated: Nov 23, 2022 11:21AM UTC

Hi Arpit. The numbers you are seeing don't necessarily mean that there is a problem. Found 71 new locations after logging in + the login URL location = 72.

The audit items are not URLs or locations. The following details are shown about each item:
- An index number for the item.
- The destination protocol, host, and URL.
- The current status of the item.
- The progress completed for the various phases of the audit, as applicable (passive, active, and JavaScript analysis).
- The number of issues identified for the item, categorized by severity.
- The number of requests made while auditing the item. Note that this is not a linear function of the number of insertion points - observed application behavior feeds back into subsequent requests, just as it would for a human tester.
- The number of network errors encountered.
- The number of insertion points created for the item.

See https://portswigger.net/burp/documentation/desktop/scanning/audit-items

Liam, PortSwigger Agent | Last updated: Nov 23, 2022 11:23AM UTC

Regarding your issue with the phase 1/recorded login errors: if that is not resolved in the other support thread, please let us know.
