The Burp Suite User Forum was discontinued on 1 November 2024.

Burp Suite User Forum

For support requests, go to the Support Center. To discuss with other Burp users, head to our Discord page.


Ignore 302's in "Discover content" tool

Liam | Last updated: Aug 25, 2015 06:02AM UTC

I *love* the Discover content tool, and use it a lot. Unfortunately, on several jobs I've run into the issue where the web server was configured to respond with a 302 instead of a 404 when a non-existent URL path was requested. For example, if a GET was issued for target.com/noSuchPage.html, the web server would respond with a 302 and redirect to something like target.com/Error.aspx?Path=/noSuchPage.html. Naturally, this confuses the hell out of the content discovery engine, and it adds every request it makes to the site map. If there were an option in the Content discovery Config tab to "ignore 302 responses", that would be extremely useful in these cases. Thanks!
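For illustration, the soft-404 pattern described above can be fingerprinted by probing a path that almost certainly doesn't exist and recording how the server answers. This is only a minimal sketch, not Burp's actual implementation; all function names are hypothetical, and it assumes the redirect target varies only in its query string (as with the `Error.aspx?Path=...` example):

```python
import random
import string

def notfound_fingerprint(status, location):
    """Build a 'not found' fingerprint from the response to a bogus
    probe path: status code plus redirect target (query string stripped,
    since Path=... varies with each requested path)."""
    if location and "?" in location:
        location = location.split("?", 1)[0]
    return (status, location)

def random_probe_path():
    """Return a path that is very unlikely to exist on the target."""
    token = "".join(random.choices(string.ascii_lowercase, k=16))
    return f"/{token}.html"

def looks_like_not_found(status, location, fingerprint):
    """Treat a response as 'not found' if it matches the fingerprint
    captured for the bogus probe path."""
    return notfound_fingerprint(status, location) == fingerprint

# Server as described above: 302 -> /Error.aspx?Path=<requested path>
fp = notfound_fingerprint(302, "/Error.aspx?Path=" + random_probe_path())
assert looks_like_not_found(302, "/Error.aspx?Path=/noSuchPage.html", fp)
assert not looks_like_not_found(200, None, fp)
```

With a fingerprint like this, a guessed path would only be added to the site map when its response does *not* match the "not found" pattern.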

PortSwigger Agent | Last updated: Aug 26, 2015 07:55AM UTC

Thanks for this report. The content discovery feature does attempt to fingerprint what is a "not-found" response, but we're aware of some cases like this where it is not quite working properly. We have a pending feature request to allow the user to configure their own fingerprints for "not-found" responses. We can't currently promise an ETA for this feature, sorry.

Burp User | Last updated: Dec 01, 2015 10:39PM UTC

Not an answer: Can I add a wishlist item to ignore 403 Forbidden errors? I am currently on a project where one directory of the server responds with a 403 for any and every request. The discover content feature gets all excited and queues a ton of requests. There are hundreds of requests queued for /doc/appl/documents/filename despite the fact that /doc/ and /doc/appl/ both responded with 403 Forbidden errors. Every other directory in the short/long lists is either found or not found and functions as expected. It's just this one directory that returns the 403. There are three possible solutions to prevent this from happening:

1. A configurable knob for how to handle 403 errors during content discovery.
2. The ability to manually edit the short/long lists. This might be possible, but I do not know how to do it. I posted a "how do I" request: https://support.portswigger.net/customer/en/portal/questions/15934705-edit-list-of-long-short-discovery-file-directory-lists?new=15934705
3. A directory blacklist feature so that discover content can be run and never come across these directories.

Burp User | Last updated: Dec 01, 2015 11:04PM UTC

Followup: Excluding /doc.* from scope does not work.

Burp User | Last updated: Dec 02, 2015 12:09AM UTC

Ignore my last comment about scope; I did not have "drop out-of-scope requests" set.

PortSwigger Agent | Last updated: Feb 19, 2016 09:46AM UTC

There isn't currently any way to configure a custom fingerprint for "not found" responses, sorry. We are aware of this limitation, and do have a pending feature request to address it, but we can't currently promise an ETA.

Burp User | Last updated: Feb 22, 2016 03:07PM UTC

This is just to add that such a feature is needed, as I'm increasingly encountering sites where the current behaviour makes the content discovery engine unusable. Please make the "ignore list" feature configurable, at least with respect to HTTP status code and response headers.

Burp User | Last updated: May 17, 2016 03:09PM UTC

Very much needed. 302 status codes from a single directory, or from a web application firewall, basically make the tool unusable for that site. Please integrate this in the next release; it would be very helpful.

Burp User | Last updated: Jul 06, 2016 12:28PM UTC

I've come across a similar issue: the site responds with a 200, but the page content is "sorry not found". Is there any way to rewrite the 200 to a 404 when the body content contains a certain string?

PortSwigger Agent | Last updated: Aug 24, 2016 08:27AM UTC

This feature hasn't been added yet, sorry.

Burp User | Last updated: Sep 07, 2016 07:02AM UTC

Has this feature been added yet? I am currently running into this issue. The problem with manually removing directories is that Burp's Discovery found every single request to be a valid path, because the server responds with "302 Found" to almost any request. Unfortunately, this means the hundreds of directories and files found are not real, and there is no good way to filter out all of the "non-real" files and directories. Thank you.

Burp User | Last updated: Sep 08, 2016 05:08PM UTC

I'm also eagerly awaiting this feature. I've been getting lots of false-positives because of 502 Proxy Errors when it brute-forces pages that don't exist.

PortSwigger Agent | Last updated: Sep 09, 2016 08:03AM UTC

Just to let you know that we've fixed this issue in today's update to Burp (1.7.15). The accuracy of automatic not-found detection has been considerably improved. Thanks again for your feedback.

Burp User | Last updated: Oct 03, 2016 08:27PM UTC

Add me to the list of those wanting to configure this based on response header / return code. The tool was unusable on a site that returned a 500 error for non-existent URLs. Even the ability to negatively/positively match a regex against the response header/body would suffice.
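The regex include/exclude filter suggested above could look something like the following. Again, this is only a sketch of the requested behaviour under assumed names, not a real Burp setting:

```python
import re

def keep_response(headers, body, positive=None, negative=None):
    """Keep a discovered response only if it matches the positive regex
    (when one is given) and does not match the negative regex."""
    blob = headers + "\r\n\r\n" + body
    if negative and re.search(negative, blob):
        return False
    if positive and not re.search(positive, blob):
        return False
    return True

# Drop the 500-for-everything responses described above.
assert not keep_response("HTTP/1.1 500 Internal Server Error", "",
                         negative=r"^HTTP/1\.1 500")
# Genuine pages survive the filter.
assert keep_response("HTTP/1.1 200 OK", "real page",
                     negative=r"^HTTP/1\.1 500")
```

Running the negative match first means an explicit exclusion always wins, which matches how most wordlist tools (dirb, DirBuster) treat their failure patterns.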

Burp User | Last updated: Dec 29, 2016 10:47AM UTC

The issue has not been fixed at all; it is still causing problems. The suggested fix would be a way for the user to define their own custom string which can then be set to be ignored. As it stands, I am currently using Engagement tools -> Search to look for certain strings in responses, then removing all results from the testing scope.

PortSwigger Agent | Last updated: Jan 03, 2017 09:27AM UTC

In our own testing on real-world sites, this is now approaching 100% accuracy. If you are still seeing cases where Content Discovery reports false positives, please email us at support@portswigger.net and we will investigate.

Burp User | Last updated: May 07, 2017 01:54AM UTC

I do not know where your accuracy readings come from, but as of version 1.7.22 this is still an issue. You need to allow customization. It is an absolute pain how Discover content goes down these rabbit holes and keeps queuing task after task. Is there any way to ignore these requests? I have tried deleting these branches from the site map, but it just ignores the deleted entries and does it anyway. A suggestion would be to let the user customize the queued task list, delete tasks, etc.

PortSwigger Agent | Last updated: May 08, 2017 10:21AM UTC

We do have feature requests in our backlog to allow people to delete individual recursive tasks generated by the content discovery tool. If you are still seeing cases where Content Discovery reports false positives, please email us at support@portswigger.net and we will investigate.

Burp User | Last updated: Oct 26, 2018 07:51AM UTC

I can second that this needs a failure pattern in some way or another. Even tools like the good old dirb or OWASP DirBuster have this; it needs to be fixed. It should be part of the Discovery options, set just before starting a run. I'm using the 2.0.09 beta and there is still no way to filter out false positives.

Burp User | Last updated: Oct 11, 2019 12:19PM UTC

Also eagerly awaiting this option! I would appreciate any tool that does contain this feature; gobuster, dirb, and dirbuster aren't currently supporting this either.

Burp User | Last updated: Oct 11, 2019 01:05PM UTC