You know how when you’re working on a WordPress site, you’ll often have “Discourage search engines from indexing this site” checked in Settings > Reading?
If you try to crawl a site that’s discouraging indexing, you’ll likely get only one result from the scan, and it’ll say the site is blocked by robots.txt.
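To see why the crawler gives up immediately, here’s a minimal sketch using Python’s standard-library robots.txt parser. The robots.txt contents below are an assumption about what a site blocking all crawlers typically serves (a blanket `Disallow: /`), not the exact output of any specific WordPress version:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt from a site that's blocking all crawlers.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A rule-respecting crawler asks this question for every URL it finds,
# and with "Disallow: /" the answer is always no.
print(parser.can_fetch("*", "https://example.com/about/"))  # False
```

Since every URL on the site is disallowed, a crawler that obeys the rules stops after fetching robots.txt itself, which is why you see just the one result.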
Well, it can be very helpful to crawl the site anyway with Screaming Frog SEO Spider to get a full listing of all pages.
To crawl it, you’ll need to change one setting:
- Open Screaming Frog SEO Spider
- Go to Configuration > robots.txt > Settings
- Click the drop-down and select “Ignore robots.txt”
- Now crawl the site
Now you can make sure you’ve caught all of your 404s and anything else you may need to check on a private instance of your website. 👌