Log file analysis shows what Googlebot actually does on your site. It reveals real problems instead of hypotheses.
Watch this short video to learn what you’ll get with log files.
If our recommendations don’t help, you’ll get your money back.

Glenn Gabe, Digital marketing expert

Log file analysis is great for effective technical website optimization as well as for basic security. On the one hand, it helps you find the problems Googlebot has run into and quickly improve the situation. On the other hand, through your log files you can detect cyberattacks on your website, spot competitors scanning your site behind a fake user agent disguised as Googlebot, and catch other attempts to harm your server.
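Telling a fake Googlebot apart from the real one is straightforward once you have the visitor’s IP from the logs. Here is a minimal sketch (the helper names are our own; the check itself follows Google’s documented verification procedure: a reverse-DNS lookup whose hostname must end in googlebot.com or google.com, confirmed by a forward lookup back to the same IP):

```python
import socket

# Google's documented rule: a genuine Googlebot IP reverse-resolves to a
# hostname under one of these domains.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname: str) -> bool:
    """True if the reverse-DNS hostname belongs to Google."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, then confirm forward DNS maps back to it."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)          # reverse lookup
        if not hostname_is_google(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward lookup
    except OSError:
        return False
```

A visitor that sends a Googlebot user-agent string but fails `verify_googlebot` is a scraper worth blocking.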
 

What are log files?

Log files contain raw data on everything that happens on the website, including visits by users and crawlers. By analyzing your log files you can get detailed information on each separate visit and each action: which pages were crawled, which server response codes were returned, referrers, IP addresses, and much more. In fact, most website owners have absolutely no idea who or what crawls their sites until they take a look at their log files.
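As an illustration, this minimal sketch (assuming the common Apache/Nginx “combined” log format; the field names are our own labels) pulls exactly those details out of a single access-log line:

```python
import re

# Regex for one line of the Apache/Nginx "combined" access-log format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return one visit's IP, path, status code, referrer and user agent."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

sample = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
          '"GET /jobs/developer HTTP/1.1" 200 5120 '
          '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
          '+http://www.google.com/bot.html)"')

hit = parse_line(sample)
print(hit["ip"], hit["path"], hit["status"])  # → 66.249.66.1 /jobs/developer 200
print("Googlebot" in hit["agent"])            # → True
```

Aggregating these parsed records over weeks of logs is what turns raw lines into the crawl-frequency and status-code reports described below.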
Clients' feedback on Logs analysis
Ihor Bankovsky, preply.com (2.7 mln. visits monthly) SEO leader
At Preply we have just started working with JetOctopus, and the log analysis service impressed us a lot. Now we can combine data from the crawler with our server logs. These reports help us find pages that are available for indexation but not crawled by Googlebot. As a result, we modified our interlinking and grew both the number of pages in the index and the traffic to them.
Stanislav Dashevsky, bookimed.com (300k monthly visits) SEO leader
Server log files are not a simple report, but a source of insights for top-level SEO professionals. Log file analysis gives you a chance to look at a site through the eyes of a robot, which means you can control indexation and crawling budget and accurately evaluate interlinking efficiency. Log file analytics is a must-have for really big sites.
Igor Shulezhko, Templatemonster (more than 3 mln. visits monthly) SEO manager
I can’t tell you how glad I am that we tried the JetOctopus log file analysis service. Right now we’re busy changing our CMS, and we needed to crawl the site before and after (on a test domain). The best part is that we found so many insights we had no idea about before. At the moment we’re implementing several changes that should increase traffic, and we’re also making some adjustments to the new CMS.
Artur Mikhno, Work.ua (more than 16 mln.visits monthly) Co-founder

We would like to point out the JetOctopus team’s special approach to their work. You can feel that the team members are big fans of their project. The main purpose of the JetOctopus crawler is to help clients achieve the best results in technical SEO. It’s really cool. This kind of cooperation always gives you energy and inspires trust!

For the webmasters of the successful job search website Work.ua, it’s crucial to know how Googlebot crawls and indexes the site. Work.ua is constantly being developed, and there are now 3+ mln webpages on the website. At some point we understood that it was impossible to handle on-page SEO optimization manually, so we searched for a reliable tool that could help us analyze and improve our website. Finally, we found the solution: the JetOctopus crawler!

The crawling report gave us valuable insights. Work.ua doesn’t have problems with indexation, and the website’s structure is well optimized for search engines. There aren’t any critical technical bugs on our website, only a few warnings, but we received important recommendations for improving it. Another positive finding is that the overall technical state of our website is OK, so we can focus on other goals and objectives. These are really important insights for us.

Dmitry Zhurakovsky, likar.info (more than 2.8 mln. visits monthly) SEO manager

Up to now, JetOctopus is the best crawler I have ever tested. It finds so many pages that no desktop crawler can compare. And the usability of the dashboard (problem visualization, segments, 150 filters to work with the data, and export of every possible problem list) makes a technical audit even more comfortable. Log file analysis by the JetOctopus team helped us find thin pages at every level of our big website. It let us build a correct roadmap of technical optimizations for a predictable increase in indexability.

  • Log file analysis lets you see the site through the eyes of a robot
  • It’s not just a report, but a source of insights for SEO experts
  • A tool to control indexation and crawling budget
  • Accurate evaluation of the efficiency of your SEO optimizations
  • A safety net in case you decide to migrate your website
  • And much more.
The best offer on logs?
300 euro per 1 mln pages crawled, with unlimited log analysis.

You will never find unlimited log file analysis anywhere else on the market. Why do we do it? Because having logs for a longer period gives us the opportunity to see the big picture and give you a much better understanding of your site’s indexation problems. Also, customer focus is our motto.

What’s inside?
Log file analysis

We download all the logs you provide us with. The longer the period the logs cover, the better, and the price doesn’t change whether the logs are for 1 month or 6 months. Store your logs and come to us when you have at least 1 month of them. After you provide the logs, it takes no more than 2 weeks to receive a detailed report back.

It is our team of analysts who analyze your logs!

Crawling the whole website

We crawl the whole website to find its main technical errors. The crawling report also answers the questions raised by the logs: Why are these particular pages not visited by bots? Is the DFI (distance from the index page) OK? Are all the indexation parameters correct? And so on.

The crawling speed is up to 200 pages per second, the fastest on the market.

Logs + Crawling + GSC overlap

Once we have the logs and the crawling report, our team of analysts overlaps this data with Google Search Console. Here we can pick out the top-ranked pages and take small actions to boost them into the top 5. To boost the pages not visited by bots, we pick out well-indexed, well-ranked pages and place links from them.
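At its core, the overlap is simple set arithmetic across the three data sources. A hypothetical sketch (the page names and the position threshold are made up for illustration, not our real report logic):

```python
# Three data sources reduced to toy page sets/maps.
crawled = {"/a", "/b", "/c", "/d"}    # indexable pages found by the crawler
bot_visited = {"/a", "/b"}            # pages Googlebot actually hit (logs)
gsc_position = {"/a": 3, "/b": 7}     # avg. ranking position (Search Console)

# Indexable pages Googlebot never visits: targets for better interlinking.
not_visited = sorted(crawled - bot_visited)

# Pages ranking just below the top: small tweaks may push them into the top 5.
near_top = sorted(p for p, pos in gsc_position.items() if 5 < pos <= 15)

print(not_visited)  # → ['/c', '/d']
print(near_top)     # → ['/b']
```

Links placed from the `near_top` pages to the `not_visited` ones address both lists at once.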

You receive a report with the main indexation problems and “what to do” recommendations.

And our team is always next to you. We hold phone calls with your technical and marketing teams as many times as you need until you understand what the problems are and how to improve the situation.


The recording of our webinar about log analysis is here

Log Files Analysis
Don’t worry: if our work turns out to be inefficient for you, you’ll get your money back.
30 days for a refund. If our final report doesn’t contain useful SEO insights, we’ll return your money.