
Burp Suite User Forum


Filtering URLs with specific words

Karthik | Last updated: May 28, 2016 02:28PM UTC

When a GET or POST request results in an error, the response URL contains the string "Could+not+create+url+for+page+path:+" (without the quotes), along with other text that depends on the request made. The string can appear anywhere in the URL and has no fixed position. For example:

http://www.domain.com/abc/page1/Could+not+create+url+for+page+path:+/xyz
http://www.domain.com/abc/page2/Could+not+create+url+for+page+path:+/pqr
http://www.domain.com/abc/123/dir1/page1/Could+not+create+url+for+page+path:+xyz/subdir1
http://www.domain.com/abc/564/dir3/page1/Could+not+create+url+for+page+path:+dir2/page1/xyz/subdir3

When a URL contains this string, I want to exclude it from scope for spidering so that only valid URLs remain. I have tried various regexes, but they do not filter out these pages, and they still appear in scope. Is this possible, and if so, could someone here help me with how to exclude them from scope?

PortSwigger Agent | Last updated: May 31, 2016 10:17AM UTC

If the expressions appear within the URL file path (which appears to be the case), then you should be able to add a suitable expression to the "exclude from scope" rules to remove these URLs from scope. First try a simple expression, such as the word "create", and get that working; then refine it into a more accurate expression if required.
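As a rough sketch (not official guidance): the error marker contains "+" characters, which are regex metacharacters, so in a regex-based exclusion rule they need to be escaped, for example Could\+not\+create\+url\+for\+page\+path. A quick way to sanity-check such a pattern outside Burp is a short Python snippet; the URLs below are just the examples from the question plus one hypothetical "valid" URL:

    import re

    # Escaped pattern for the error marker ("+" is a regex metacharacter).
    pattern = re.compile(r"Could\+not\+create\+url\+for\+page\+path")

    urls = [
        "http://www.domain.com/abc/page1/Could+not+create+url+for+page+path:+/xyz",
        "http://www.domain.com/abc/page2/Could+not+create+url+for+page+path:+/pqr",
        "http://www.domain.com/abc/valid/page3",  # hypothetical URL that should stay in scope
    ]

    for url in urls:
        # URLs matching the pattern would be excluded from scope; the rest remain valid targets.
        print(url, "->", "exclude" if pattern.search(url) else "keep")

In Burp itself, the escaped expression would go into an "Exclude from scope" rule under Target > Scope (matching on the file portion of the URL when using advanced scope control), though the exact placement may vary by version.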
