Logs Analysis For SEO Boost

Oct 22, 2018

Logs are the only data source that shows with complete accuracy how Googlebot crawls a website. Deep log analysis can help you boost indexability and rankings, attract valuable traffic, and improve conversions and sales.

In this article we will answer the following questions:

  1. What are logs and log analysis?
  2. How can crawl budget be optimized with the help of log analysis?
  3. How can a website’s indexability be improved with the help of logs?
  4. Why is it important to analyze logs during a website migration?

Logs & log analysis

A log (log file) is a text data file that contains systematic information about search bot visits. In other words, a log is a computer file with a record of the actions that have been performed on a website.

Log analysis involves processing archived log files covering at least a month of website activity, and ideally half a year, because it takes time to build a complete picture of search bot behavior. Logs show you how often Googlebot visits your site, which web pages it requests and which it ignores. At first glance, this data may seem unimportant, but if you analyze it over at least a few months, you can assess trends in your business comprehensively. Logs are genuinely valuable data you should use for on-page optimization. Although Googlebot’s algorithms change constantly, log file analysis helps you see long-term trends.
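To make this concrete, here is a minimal sketch of what reading a log file looks like. The sample lines, the regular expression, and the `googlebot_hits` helper are illustrative assumptions, not part of any specific tool; in practice you would read your server's access log, and formats vary between servers.

```python
import re
from collections import Counter

# Hypothetical sample lines in the combined log format;
# a real analysis would read them from your server's access.log.
SAMPLE_LOG = [
    '66.249.66.1 - - [22/Oct/2018:06:25:11 +0000] "GET /category/shoes HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '93.184.216.34 - - [22/Oct/2018:06:25:14 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [22/Oct/2018:06:26:02 +0000] "GET /old-page HTTP/1.1" 404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Pull out the request path, status code, and user agent from each line.
LINE_RE = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Return (path, status) pairs for requests made by Googlebot."""
    hits = []
    for line in lines:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits.append((m.group("path"), int(m.group("status"))))
    return hits

hits = googlebot_hits(SAMPLE_LOG)
status_counts = Counter(status for _, status in hits)
```

Even this tiny sample already reveals something useful: one of Googlebot's two visits was spent on a 404 page. (Note that matching the user-agent string alone can be spoofed; serious analysis also verifies the requesting IP.)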

Log analysis for crawl budget optimization

Today there are many definitions of ‘crawl budget’, and it can be hard to pin down the right one. The Google Webmaster Central Blog published the article "What Crawl Budget Means for Googlebot", in which the author clarifies the term. In simple words, crawl budget is the limited resource Googlebot is willing to spend on each website. In practice, Googlebot can waste that budget on irrelevant web pages. For instance, suppose you have a crawl budget of 5,000 web pages per day.

In this case you want those 5,000 web pages to be the ones shown in organic search results, but instead Googlebot crawls 1,000 irrelevant web pages. If the crawl budget is wasted on useless URLs, relevant content won’t be indexed. Log analysis helps you understand where the crawl budget is being spent.
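One simple way to see where budget goes is to group the URLs Googlebot requested (extracted from the logs) by their first path segment. This is a sketch with made-up URLs and section names:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical URLs Googlebot requested, as extracted from the logs.
crawled_urls = [
    "/product/red-shoes", "/product/blue-shoes",
    "/search?q=shoes&page=17", "/search?q=boots&page=42",
    "/search?q=heels&page=3", "/product/sandals",
]

def section(url):
    """First path segment, e.g. '/product/red-shoes' -> 'product'."""
    path = urlparse(url).path
    parts = [p for p in path.split("/") if p]
    return parts[0] if parts else "(root)"

budget_by_section = Counter(section(u) for u in crawled_urls)
# Here half the bot's visits went to parameterized search pages
# you probably don't want indexed at all.
```

A breakdown like this makes it obvious when a disproportionate share of visits goes to faceted search, filters, or other low-value sections.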

Analyze logs to improve website indexability

Let's define the term 'indexing'. In the SEO world, it is the process of adding web pages to the Google search engine's database. After the search bot crawls a website and enters the data into the index, your site can be ranked in the organic search results. In Google Search Console you can find valuable crawl stats.

Nevertheless, GSC won’t show you HOW the search bot scans your site step by step. If you want to know for sure which pages were crawled and which were not, try log analysis. Without indexing, your website has no chance of ranking.

Indexability can vary between different types of websites, but if more than 80% of your pages are not indexable, you should investigate: otherwise, crawl budget is being wasted.
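A simple check along these lines is a set comparison between the URLs you want indexed (for example, from your sitemap) and the URLs Googlebot actually requested according to the logs. All URLs below are hypothetical:

```python
# Hypothetical data: URLs listed in the sitemap vs. URLs Googlebot
# actually requested according to the logs.
sitemap_urls = {"/", "/product/red-shoes", "/product/blue-shoes", "/category/shoes"}
log_urls = {"/", "/product/red-shoes", "/search?q=shoes"}

# Important pages the bot never visited -> likely indexability problems.
never_crawled = sitemap_urls - log_urls
# Pages the bot visits that aren't in the sitemap -> budget spent elsewhere.
crawled_but_unlisted = log_urls - sitemap_urls

# Share of sitemap URLs the bot has actually seen.
crawl_ratio = len(sitemap_urls & log_urls) / len(sitemap_urls)
```

Tracking `crawl_ratio` over time (per month, per section) is more informative than any single snapshot, which matches the article's earlier point about analyzing several months of logs.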

Log analysis helps reduce a website migration's negative impact

If your site is migrating to a new CMS, log file analysis is crucial for avoiding SEO drops. Unfortunately, a website migration never goes without bugs. Log files help you eliminate errors quickly, before Google indexes them after the migration.

Of course, a log analyzer is a crucial helper for every webmaster, but you will achieve a real SEO boost if you combine log analysis with crawling. A web crawler is a tool that acts like a search bot: it finds the technical bugs that must be removed for a successful migration. In other words, log analysis shows which pages aren’t being visited, while crawling shows you why those pages aren’t being visited. So you fix bugs based on the crawl data and then check in the logs whether your improvements work.
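The combination described above can be sketched as a join between the two data sources: the crawler supplies a status code per URL, and the logs supply the set of URLs Googlebot actually visits. The URLs and status codes here are hypothetical placeholders for a real crawl export and log extract:

```python
# Hypothetical crawler export: URL -> HTTP status code found by the crawler.
crawl_report = {
    "/product/red-shoes": 200,
    "/old-category": 404,
    "/promo": 301,
    "/product/blue-shoes": 200,
}
# Hypothetical set of URLs Googlebot requested, taken from the logs.
googlebot_seen = {"/product/red-shoes", "/old-category"}

# Broken pages the bot is still spending budget on -> fix these first.
urgent = [u for u, code in crawl_report.items()
          if code >= 300 and u in googlebot_seen]

# Healthy pages the bot never visits -> check internal linking / sitemap.
orphaned = [u for u, code in crawl_report.items()
            if code == 200 and u not in googlebot_seen]
```

The two lists correspond to the article's two halves of the picture: the crawl explains the "why" (a 404, a redirect), while the logs supply the "which" (pages the bot does or doesn't see).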

To sum up:

  1. Log analysis helps you ‘see’ your site the way Googlebot does
  2. Logs help you optimize the crawl budget
  3. Logs can help improve a website’s indexing
  4. Log analysis helps reduce a website migration’s negative impact
  5. Combining log analysis with crawling gives the full picture of your site’s problems



Ann Yaroshenko is a Content Marketing Strategist at JetOctopus. She holds a Master’s degree in publishing and editing and a Master’s degree in philology, and has two years of experience in Human Resources Management. Ann has been part of the JetOctopus team since 2018.

Case study: auto classifieds, 20M pages crawled
What problem was your SEO department working on when you decided to try our crawler?
We needed to detect all possible errors in no time because Google Search Console shows only 1000 results per day.

What problems did you find?
That’s quite a broad question. We managed to detect old, unsupported pages and errors related to them. We also found a large number of duplicated pages and pages with 404 response code.

How quickly did you implement the required changes after the crawl?
We are still implementing them, because the website is large and there are lots of errors on it. There are currently four teams working on the website, so we have to assign each particular error to a particular team and draw up individual statements of work.

And what were the results?
It’s quite difficult to measure results right now because we constantly work on the website and make changes. But a higher scan frequency by bots would mean the changes are productive. However, around a month and a half ago we allowed all the paginated pages to be indexed, and this has already affected our statistics.

Having seen the crawl report, what was the most surprising thing you found? (Were there errors you’d never thought you’d find?)
I was surprised to find so many old, unsupported pages outside the website’s structure. There were also a large number of 404 pages. We are really glad we managed to get a breakdown of the website by subdirectory; that helped us decide which team to start working with first.

You have worked with different crawlers. Can you compare JetOctopus with the others and assess it?
Every crawler looks for errors and finds them. The main point is the balance between the scanned pages and the price. JetOctopus is one of the most affordable crawlers.

Would you recommend JetOctopus to your friends?
We’re going to use it within our company from time to time. I would recommend the crawler to my friends if they were SEO optimizers.

Your suggestions for JetOctopus.
To refine the web version ASAP. There are a few things we were missing badly:
Thank you very much for such a detailed analysis. We are currently reflecting on a problem with redirects.