Burp Suite User Forum


v2.02 Beta version - Unable to Crawl and Audit

Pedro | Last updated: May 07, 2019 09:46PM UTC

Hello, I'm testing Burp Suite for my company, and I had the older version working flawlessly: I could crawl and run passive/active scans without issues. My boss was curious about the new features of the beta version, so I upgraded to it, and now I'm running into problems.

For Target Scope I have included a URL path: https://test01-www.test.com. I have the certificate installed (from the previous version, not sure if that matters) and proxy settings matching 127.0.0.1:8080 in my browser. When I go to https://test01-www.test.com, the default "Live passive crawl from proxy (all traffic)" task starts capturing data. But when I start a "New scan" doing a "crawl and audit" using the built-in scan configurations from the library, it doesn't give me any results: it attempts for 5 minutes, then finishes with nothing. It doesn't crawl or scan. I've tried a few different default settings ("Audit coverage - maximum", "Audit checks - passive", "Crawl strategy - more complete") and none of them work.

Any thoughts on what is going on? We have a pretty big site and I want to start a scan and let it run through the whole site. Thank you

Rose, PortSwigger Agent | Last updated: May 08, 2019 09:31AM UTC

Is the site you are trying to scan JavaScript-heavy? At the moment, the crawler cannot crawl JavaScript-generated content, but this is a feature that is currently being worked on. This behaviour was not present in Burp 1.x either, so can you confirm whether you were able to spider and scan this particular site in Burp 1.x? Could you also send us an email (support@portswigger.net) with your crawler log file? To produce one, run a crawl on the site using Burp Pro 2 with debug logging enabled: create a crawl configuration and, within crawl optimization, click the cog button and enable logging.
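To illustrate the JavaScript limitation mentioned above: a crawler that only parses static HTML finds nothing to follow on a page whose navigation is built entirely by client-side script. The sketch below (a simplified stand-in, not Burp's actual crawler; the page markup is hypothetical) extracts `<a href>` links with Python's standard `html.parser`, which treats `<script>` content as raw data, so the JS-injected links are invisible to it.

```python
from html.parser import HTMLParser

# Hypothetical page: the nav links only exist after the browser runs the
# script, so the static HTML contains no <a> tags at all.
JS_HEAVY_PAGE = """
<html><body>
<div id="nav"></div>
<script>
  document.getElementById('nav').innerHTML =
    '<a href="/products">Products</a><a href="/contact">Contact</a>';
</script>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collects href attributes from <a> tags, as a static crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # <a> tags inside <script> are never seen here: html.parser hands
        # script bodies to handle_data() as opaque text, not as markup.
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href"]

parser = LinkExtractor()
parser.feed(JS_HEAVY_PAGE)
print(parser.links)  # empty -- nothing for a static crawler to follow
```

This matches the symptom described in the original post: the live passive task still logs proxied traffic (the browser executes the JavaScript), while an unassisted crawl of the same site finishes quickly with no results.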
