Burp Suite User Forum

Scanned URLs are showing only two URLs: the parent URL and parent URL/robots.txt

Paresh | Last updated: Feb 09, 2021 07:36AM UTC

After scanning the web app I can see only two URLs in the Scanned URLs list: the parent URL and parent URL/robots.txt. I have added the parent URL as the site URL and nothing in the included URLs. Do I need to manually add all the internal URLs, like parent URL/Home, parent URL/Show, etc.? If the Burp scan covers all the internal URLs automatically, where can I see the details of the internal URLs scanned, so I can be sure? Regards, Paresh Badgujar

Ben, PortSwigger Agent | Last updated: Feb 09, 2021 09:45AM UTC

Hi Paresh, I have just responded to the email that you sent in on this matter. It might be easier to continue this discussion via the email thread.

David | Last updated: Mar 15, 2021 02:03AM UTC

Hi guys! I am trialing the Enterprise version and I have the exact same issue. I have also unchecked the 'request robots.txt' option, but the scan still gets stuck at the beginning. Other tools I have used and tried simply bypass this. How do I do that here? Please add me to this thread. Thanks /Dave

Ben, PortSwigger Agent | Last updated: Mar 15, 2021 03:17PM UTC

Hi Dave, I replied to the email that you also sent in to us about this - it might be easier to continue troubleshooting your issue via email, so we will await your response there.

Rui | Last updated: Sep 14, 2021 10:14AM UTC

I'm also facing this issue on one application - how can the coverage/crawling be improved? I've checked the event log and there is an error event there saying "Communication error: host".

Ben, PortSwigger Agent | Last updated: Sep 14, 2021 10:30AM UTC

Hi Rui, To confirm, are you experiencing this issue with Burp Professional or Burp Enterprise?

Rui | Last updated: Sep 14, 2021 12:21PM UTC

Hi Ben, I'm experiencing this issue with Burp Suite Enterprise Edition v2021.8.1

Ben, PortSwigger Agent | Last updated: Sep 14, 2021 01:20PM UTC

Hi Rui, I can see that you have also sent us an email about this issue - in terms of information sharing it is probably going to be easier to troubleshoot this via the email thread that you have with my colleague (who has just replied to you). We will await your response there.

Rui | Last updated: Sep 14, 2021 03:25PM UTC

Hi Ben, the issue is solved now! Basically, the first time the app is accessed there is a redirect to an identity provider, then another redirect, and then a final one back to the app after getting cookies. The scanner apparently does not follow those redirects. I've added a recorded login sequence that just goes to that first redirect location and retrieves the needed cookies, and now it is working and reaching all of the expected locations. Thank you anyway for the help. Your email from support helped me understand where the mistake could be.
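
For anyone hitting a similar redirect chain, it can help to see the hops outside of Burp before building the recorded login. Below is a minimal sketch, assuming Python with the requests library installed; the URL is a placeholder, not Rui's application.

# Minimal sketch: follow the redirect chain from a start URL and print each hop,
# plus the cookies accumulated along the way. Assumes `pip install requests`.
import requests

START_URL = "https://app.example.com/"  # placeholder start URL

with requests.Session() as session:
    resp = session.get(START_URL, allow_redirects=True, timeout=30)

    # resp.history holds each intermediate redirect response, in order.
    for hop in resp.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))

    # Final destination after all redirects, and the cookies the chain set.
    print(resp.status_code, resp.url)
    print(session.cookies.get_dict())

If the chain bounces through an identity provider as Rui describes, the intermediate hops and the cookies printed here are what the recorded login sequence needs to reproduce.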

Ben, PortSwigger Agent | Last updated: Sep 15, 2021 07:26AM UTC

Hi Rui, Glad to hear that you were able to solve this one!

Eugene | Last updated: Oct 18, 2021 08:05AM UTC

Hi Ben, We have the same issue here using Burp Suite Professional - Version 2021.8.4. Could you help please? Our staging site scans fine, but on the live site the suite only crawls /, /robots.txt, and /sitemap.xml.

Ben, PortSwigger Agent | Last updated: Oct 18, 2021 02:07PM UTC

Hi Eugene, Is the scan for your live site being reported as completing successfully (just with a lack of coverage) or is it encountering any errors? Is the bulk of the site content behind a login function and, if so, is the authentication handled differently on your live site compared to your staging site? Please feel free to send us an email at support@portswigger.net if you would prefer to share these details more privately.

Eugene | Last updated: Oct 19, 2021 08:36AM UTC

Hi Ben, Thanks for your response. The scan completed successfully, although there were 3 errors thrown of the type 'Communication Error'. I can see in the Logger that the homepage was returned with a 200 response and all of its HTML, so I am wondering why none of the links have been followed; we are using all default scan and crawl configurations. Our test site has basic HTTP auth, which I successfully added to the project options, but for the live scan I have removed this and we are scanning just public pages. Is there anything I can try? Thanks
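
For readers debugging the same symptom, one quick check is whether the links are actually present in the raw homepage HTML, or only injected by client-side JavaScript (which, depending on the crawler configuration, can be missed). A rough stand-alone check, using only Python's standard library; the URL is a placeholder:

# Hypothetical diagnostic, separate from Burp: list the <a href> values in the
# raw homepage HTML. An empty list on a 200 page suggests the links are built
# by JavaScript rather than served in the markup.
from html.parser import HTMLParser
import urllib.request

URL = "https://www.example.com/"  # placeholder for the live site

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

html = urllib.request.urlopen(URL, timeout=30).read().decode("utf-8", "replace")
collector = LinkCollector()
collector.feed(html)
print(f"{len(collector.links)} links found")
for link in collector.links:
    print(" ", link)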

Ben, PortSwigger Agent | Last updated: Oct 20, 2021 07:09AM UTC

Hi Eugene, Are you able to email us at support@portswigger.net and include some screenshots of the details of the scan so that we have a better understanding of what has happened? If you click the 'View details' link for your scan on the Dashboard tab of Burp, could we get a screenshot of:

- The 'Event log' tab (make sure to enable all filters here - the 'Debug' filter is usually off by default)
- The 'Logger' tab (so that we can see the last requests that were successfully made to the site)
- The 'Details' tab (so that we can see the request and location details)

Could you also perform the following steps to set up both a headed crawl and debug logging? The headed crawl will launch a browser during the scan and visually show you what is happening (this might provide us with some insight), and the debug crawl logs will provide more information about what Burp is doing during the crawl phase of the scan.

You can set up a headed crawl by creating a new crawl configuration (when you create a new scan, navigate to the Scan configuration screen and click New -> Crawling) and then enabling the Miscellaneous -> 'Show the crawl in a headed browser' option. If you give the configuration a name and then save and apply it to your scan, the headed crawl option will be used.

Similarly, you can enable the debug logs by going to your crawl scan configuration and selecting 'Crawl optimization > Cog button > Enable logging'. You will also need to provide a suitable location for the log file to be written to by configuring the 'Directory' field there. Could you then send us the log file that is created in that directory when you have carried out your scan?

In addition, does the headed crawl provide any visual indication of why the scan is not reaching the URLs that you are expecting it to reach? If you have any issues configuring any of the above then please let us know and we can provide some screenshots in the email that you send us.

Eugene | Last updated: Oct 22, 2021 02:59PM UTC

Thanks - with Ben's help it turned out that we needed to disable the 'request sitemap' checkbox in the Crawling configuration's Miscellaneous settings. With this change we saw the site coverage that we expected.
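
For reference, if you suspect the sitemap is what is steering the crawl, you can inspect what it advertises before toggling that checkbox. A small sketch, assuming Python with the requests library; the site URL is a placeholder, and a sitemap index file (rather than a plain urlset) would need an extra pass:

# Print every URL that /sitemap.xml advertises. Stale or off-site entries here
# can explain why seeding the crawl from the sitemap hurts coverage.
import xml.etree.ElementTree as ET
import requests

SITE = "https://www.example.com"  # placeholder for the live site

resp = requests.get(f"{SITE}/sitemap.xml", timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

for loc in root.findall(".//sm:loc", ns):
    print(loc.text)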

Nagalakshmi | Last updated: May 30, 2023 08:28AM UTC

Hi Team, I use Burp Suite Enterprise Edition. While running scans, during the crawling phase I can see the current URL being scanned, which is an authenticated page (e.g. domain.com/a/authenticated_pages). But after the crawl completes successfully, I don't see those URLs in the Scanned URLs list, and there are also no findings for those endpoints. The recorded login is working fine and the log says 'Found 40+ new locations after logging in using recorded login', but I don't see any of them in the results. I haven't added any in-scope or out-of-scope URLs, only the start URL (domain.com). I've emailed the support team twice but have had no response. I'm hoping for some help. Thanks.
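
For readers with the same symptom: one possible cause is that the locations discovered after login sit on a different host than the start URL (for example, on the identity provider) and so fall outside the default scope and are dropped from the results. A hypothetical illustration of that host-matching idea, with placeholder URLs; this is not Burp's actual scope implementation:

# Illustration only: check which discovered URLs share the start URL's origin.
from urllib.parse import urlparse

START_URL = "https://domain.com"  # the only scope entry in this scenario
discovered = [
    "https://domain.com/a/authenticated_pages",  # placeholder examples
    "https://idp.example.com/session/start",
]

start = urlparse(START_URL)
for url in discovered:
    u = urlparse(url)
    in_scope = (u.scheme, u.netloc) == (start.scheme, start.netloc)
    print("in scope " if in_scope else "out of scope", url)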

Alex, PortSwigger Agent | Last updated: May 30, 2023 11:00AM UTC

Hi Nagalakshmi, I understand you are now in communication with one of my colleagues via a support case, so we shall assist you directly there. Best regards,
