Burp Suite User Forum

Scan a predefined URLs list without crawling new URLs

Reo | Last updated: Feb 22, 2021 02:08PM UTC

Hello, I have been trying to run an active scan on a URL list. What I want to achieve is to make Burp scan every URL that I provide. The issue is that when I run an active scan on my URL list, Burp crawls from every URL and I end up with a huge number of URLs scanned, whereas I just want the URLs in the list, nothing more. I tried limiting the crawl depth to zero to avoid this, but then it doesn't seem to scan every URL that I provide. I also tried running wget requests through Burp's proxy to populate the site map, but again it ends up crawling every URL in the same way. I also tried a few Burp extensions, without success. What I want to achieve is really simple: scan my URL list without crawling any other URLs. Is that possible? And if so, how should I proceed? I'm using Burp 2021.2.1. Thank you in advance.
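The replay-through-proxy approach described above can also be scripted. The sketch below is a minimal Python version of the same idea; it assumes Burp's default proxy listener on 127.0.0.1:8080 and a urls.txt file containing one full URL (including scheme) per line, both of which are assumptions rather than details from this thread.

    # Sketch only: send each URL in urls.txt once through Burp's proxy so it
    # lands in the site map, without any crawling on the client side.
    # The listener address and file name are assumptions.
    import requests

    PROXIES = {
        "http": "http://127.0.0.1:8080",
        "https": "http://127.0.0.1:8080",
    }

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        # verify=False is needed because Burp re-signs TLS with its own CA
        response = requests.get(url, proxies=PROXIES, verify=False, timeout=30)
        print(response.status_code, url)

Each listed URL is requested exactly once, so the script itself adds nothing extra to the site map; as the rest of the thread shows, any additional crawling comes from Burp's default live tasks rather than from the replayed requests.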

Uthman, PortSwigger Agent | Last updated: Feb 22, 2021 02:45PM UTC

Hi Reo, It sounds like you want the 'Do active scan' context menu option. Can you please go to Target > Site map, select the requests you want to actively scan, then right-click > Do active scan? This only performs an audit. Does that meet your requirements? When you mention that you "end up with a huge number of URLs scanned", where are you viewing this? In the site map? Or by selecting View details > Audit items in your scan task?

Reo | Last updated: Feb 22, 2021 03:30PM UTC

Hello Uthman, Thank you for your quick reply! Yes, I'm talking about the Target > Site map, which contains more URLs than I need (I'm only talking about in-scope URLs). My question would be: how do I populate this site map with only the URLs I provide? For example, if I provide these URLs (by running wget through Burp's proxy):

www.example.com/page1
www.example.com/page2
www.example.com/anotherCategory1

I would like to run an active scan only on these 3 URLs, but the site map is being filled with other URLs from the same domain, such as www.example.com/pageIDontWantToScan. I can't select each URL individually because my list is composed of hundreds of different URLs, which is why I'm a bit lost.
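One way to audit hundreds of listed URLs without selecting them one by one is to feed each URL straight to the active scanner from an extension, using the legacy Extender API's doActiveScan method, which bypasses the crawler entirely. The Jython sketch below is illustrative only: the file path is hypothetical and each line of the file is assumed to be a full URL including its scheme.

    # Sketch of a Jython Burp extension that queues an active scan (audit) for
    # every URL in a list file, without crawling. The file path is hypothetical
    # and each line is assumed to look like https://www.example.com/page1.
    from burp import IBurpExtender
    from java.net import URL

    class BurpExtender(IBurpExtender):
        def registerExtenderCallbacks(self, callbacks):
            callbacks.setExtensionName("Audit a fixed URL list")
            helpers = callbacks.getHelpers()
            with open("/path/to/urls.txt") as f:  # hypothetical path
                for line in f:
                    url = line.strip()
                    if not url:
                        continue
                    jurl = URL(url)
                    use_https = jurl.getProtocol() == "https"
                    port = jurl.getPort()
                    if port == -1:
                        port = 443 if use_https else 80
                    # buildHttpRequest() builds a GET request for the given URL
                    request = helpers.buildHttpRequest(jurl)
                    callbacks.doActiveScan(jurl.getHost(), port, use_https, request)

Each doActiveScan call queues that single request for auditing, so only the listed URLs are scanned and nothing in the extension triggers a crawl.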

Uthman, PortSwigger Agent | Last updated: Feb 22, 2021 03:38PM UTC

Hi Reo, Thanks a lot for clarifying. If you launch Burp and delete the two default tasks (Live passive crawl from Proxy and Live audit from Proxy), are you still seeing the URLs being crawled? Can you share the wget command you are using?

Reo | Last updated: Feb 22, 2021 05:33PM UTC

Hi again Uthman, Indeed, the issue was due to the Live passive crawl from Proxy and Live audit from Proxy tasks running in the background. I disabled them and got it to work as I expected, with only the URLs I provide. Thank you for your kind help.

Uthman, PortSwigger Agent | Last updated: Feb 23, 2021 09:31AM UTC