The Screaming Frog SEO Log File Analyser allows you to upload your log files, identify crawled URLs and analyse search bot data and behaviour for invaluable SEO insight.
The Screaming Frog Log File Analyser allows you to quickly upload a log file and analyse it from an SEO perspective. Log file data is extremely valuable, as it shows exactly what happens when a search engine bot visits your website. However, log files can be extremely large and difficult to analyse without programming experience or the right tool. The Log File Analyser is designed specifically for SEOs to make this process less painful.
The Log File Analyser supports the Apache and W3C Extended Log File Formats, which cover Apache, IIS and NGINX servers. Simply drag and drop one or more log files directly onto the user interface, and the tool will automatically process the data, allowing you to analyse and filter for common SEO issues in the program’s user interface, or export to CSV and Excel.
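For a sense of what the underlying data looks like, here is a minimal sketch (not the tool’s internals) of parsing a line in the standard Apache combined log format with Python and picking out a Googlebot request. The sample line and field names are illustrative:

```python
import re

# Apache "combined" log format: IP, identd, user, [timestamp], "request",
# status, bytes, "referrer", "user-agent".
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields for one log line, or None if it doesn't match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

sample = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
          '"GET /blog/page HTTP/1.1" 200 2326 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

hit = parse_line(sample)
if hit and 'Googlebot' in hit['agent']:
    print(hit['url'], hit['status'])  # -> /blog/page 200
```

Note that user-agent strings can be spoofed, so confirming a hit really came from Googlebot requires a reverse DNS lookup on the requesting IP rather than trusting the agent string alone.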
- Identify Crawled URLs: View and analyse exactly which URLs Googlebot & other search bots are able to crawl, when and how frequently.
- Discover Crawl Frequency: Get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events (see the first sketch after this list).
- Find Broken Links & Errors: Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.
- Audit Redirects: Find temporary and permanent redirects encountered by search bots, which may differ from those seen in a browser or simulated crawl.
- Improve Crawl Budget: Analyse your most and least crawled URLs & directories, to identify waste and improve crawl efficiency.
- Identify Large & Slow Pages: Review the average bytes downloaded & time taken to identify large pages or performance issues.
- Find Uncrawled & Orphan Pages: Import a list of URLs and match it against log file data, to identify orphan or unknown pages, or URLs which Googlebot hasn’t crawled (see the second sketch after this list).
- Combine & Compare Any Data: Import and match any data with a ‘URLs’ column against log file data. For example, import crawls, directives, or external link data for advanced analysis.
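To make the crawl frequency idea concrete: aggregating bot events per URL and per day from parsed log lines is essentially counting. A hedged sketch, reusing the hypothetical `parse_line` helper from above (not the tool’s implementation):

```python
from collections import Counter
from datetime import datetime

def crawl_frequency(lines):
    """Count bot events per URL and per day from raw log lines."""
    per_url, per_day = Counter(), Counter()
    for line in lines:
        hit = parse_line(line)  # helper sketched earlier; returns None on non-matching lines
        if not hit or 'bot' not in hit['agent'].lower():
            continue
        per_url[hit['url']] += 1
        # Apache timestamps look like "10/Oct/2023:13:55:36 +0000".
        day = datetime.strptime(hit['time'], '%d/%b/%Y:%H:%M:%S %z').date()
        per_day[day] += 1
    return per_url, per_day
```

The `per_url` counter surfaces your most and least crawled URLs, while `per_day` gives the events-per-day trend the tool charts for you.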
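The orphan and uncrawled checks then reduce to set differences between an imported URL list and the URLs seen in the logs. Another illustrative sketch, where `crawl_urls.txt` is a hypothetical one-URL-per-line export from a site crawl:

```python
def match_against_logs(crawl_file, log_urls):
    """Split URLs into orphans (in logs, not in crawl) and uncrawled (in crawl, not in logs)."""
    with open(crawl_file) as f:
        crawled = {line.strip() for line in f if line.strip()}
    orphans = log_urls - crawled      # bots request them, but the crawl never found them
    uncrawled = crawled - log_urls    # linked on the site, but no bot has fetched them
    return orphans, uncrawled

# Example usage, combining with the per-URL counts from crawl_frequency():
#   per_url, _ = crawl_frequency(open('access.log'))
#   orphans, uncrawled = match_against_logs('crawl_urls.txt', set(per_url))
```

The same join-on-URL idea underpins combining any imported data, such as crawls, directives or external link metrics, with log file events.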