The Log File Analyser is light, but extremely powerful - able to process, store and analyse millions of lines of log file event data in a smart database. It gathers key log file data to allow SEOs to make informed decisions. Some of the common uses include:
- Identify Crawled URLs: View and analyse exactly which URLs Googlebot and other search bots are able to crawl, when and how frequently.
- Discover Crawl Frequency: Get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.
- Find Broken Links & Errors: Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.
- Audit Redirects: Find temporary and permanent redirects encountered by search bots, which might differ from those seen in a browser or simulated crawl.
- Improve Crawl Budget: Analyse the most and least crawled URLs and directories of the site to identify waste and improve crawl efficiency.
- Identify Large & Slow Pages: Review the average bytes downloaded and time taken to identify large pages or performance issues.
- Find Uncrawled & Orphan Pages: Import a list of URLs and match it against log file data to identify orphan pages, or URLs which Googlebot hasn't crawled.
- Combine & Compare Any Data: Import and match any data with a 'URLs' column against log file data, such as crawls, directives or external link data, for advanced analysis.
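The orphan and uncrawled page checks above boil down to set differences between URLs seen in a crawl and URLs seen in bot log events. A minimal sketch of that idea, with illustrative URLs (the tool's internal matching is not public):

```python
# Illustrative data: URLs discovered by a site crawl vs. URLs
# requested by search bots according to the log files.
crawled_urls = {
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/contact",
}

log_urls = {
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/old-page",   # in logs, but not in the crawl
}

# Orphan pages: bots request them, but the crawl never found them.
orphans = log_urls - crawled_urls

# Uncrawled pages: the crawl found them, but no bot event exists.
uncrawled = crawled_urls - log_urls

print(sorted(orphans))    # ['https://example.com/old-page']
print(sorted(uncrawled))  # ['https://example.com/contact']
```

The same two-set comparison generalises to any imported data with a 'URLs' column.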
What is new in this release:
- Search Engine Bot Verification
- Directories Tab With Aggregated Log Events
- Directory Tree View
- User-Agents, IPs & Referers Tabs
- Amazon Elastic Load Balancing Support
- Other Smaller Updates
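Search engine bot verification is commonly done with the reverse-plus-forward DNS check that Google documents for Googlebot: reverse-resolve the requesting IP, check the hostname belongs to an allowed domain, then forward-resolve that hostname and confirm it maps back to the same IP. A hedged sketch of that general technique (the Log File Analyser's exact implementation is not public; the resolver parameters here are for illustration and testing):

```python
import socket

# Domains Google documents for verified Googlebot reverse DNS.
GOOGLEBOT_SUFFIXES = (".googlebot.com", ".google.com")

def verify_bot(ip, allowed_suffixes=GOOGLEBOT_SUFFIXES,
               reverse=lambda ip: socket.gethostbyaddr(ip)[0],
               forward=socket.gethostbyname):
    """Return True only if ip reverse-resolves to an allowed domain
    and that hostname forward-resolves back to the same ip."""
    try:
        host = reverse(ip)
    except OSError:
        return False
    if not host.endswith(allowed_suffixes):
        return False            # hostname can be spoofed by anyone
    try:
        return forward(host) == ip   # forward lookup must round-trip
    except OSError:
        return False

# Example with stubbed resolvers, so no network access is needed:
ok = verify_bot("66.249.66.1",
                reverse=lambda ip: "crawl-66-249-66-1.googlebot.com",
                forward=lambda host: "66.249.66.1")
print(ok)  # True
```

The forward-confirmation step matters because user-agent strings and reverse DNS records alone can both be spoofed.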
What is new in version 1.8:
- Support for W3C date-local and time-local fields.
- Better detection of file types.
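W3C extended-format logs declare their column layout in a `#Fields:` directive, and the date-local and time-local fields are the local-time variants of the UTC date/time fields. A minimal sketch of reading such a log, with an illustrative sample (not the tool's actual parser):

```python
# Illustrative W3C extended-format log using local-time fields.
sample = """\
#Version: 1.0
#Fields: date-local time-local cs-uri-stem sc-status
2024-01-15 09:30:01 /index.html 200
2024-01-15 09:30:05 /missing 404
"""

def parse_w3c(text):
    """Parse W3C extended log lines into dicts keyed by the
    column names declared in the #Fields: directive."""
    fields, rows = [], []
    for line in text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]      # column names follow the directive
        elif line and not line.startswith("#"):
            rows.append(dict(zip(fields, line.split())))
    return rows

events = parse_w3c(sample)
print(events[1]["sc-status"])  # 404
```

Keying rows by the declared field names is what lets a parser handle logs regardless of which optional fields a server emits.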