Cases

Nov 25, 2019
You probably know log files are treasure troves. You can see how bots crawl your pages and which content is frequently visited or, on the contrary, ignored by bots. But SEOs often don't know how to get value from logs: exploring the sheer volume of data manually is time-consuming, and analytics tools can be costly. Still, you need a way to extract the insights hidden in your logs. Today we're going to demonstrate how log file analysis helps reveal tag-related problems, indexability issues, and crawl budget waste.
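As a rough illustration of the idea (not the tooling used in the case), here is a minimal Python sketch that counts Googlebot hits per URL and status code, assuming a combined-format access log in a hypothetical access.log file:

```python
import re
from collections import Counter

# Hypothetical example: count Googlebot hits per URL in a combined-format access log.
# Assumes a local file named "access.log"; real logs are often rotated and gzipped.
LINE_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

hits = Counter()
statuses = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:  # crude user-agent filter; verify with reverse DNS for accuracy
            continue
        match = LINE_RE.search(line)
        if not match:
            continue
        hits[match.group("path")] += 1
        statuses[match.group("status")] += 1

print("Top crawled URLs:", hits.most_common(10))
print("Status code distribution:", statuses.most_common())
```

Even this simple aggregation makes crawl budget waste visible: URLs at the top of the list that bring no organic value are strong candidates for investigation.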
Nov 11, 2019
Changing a site's CMS rarely goes smoothly. Issues such as broken links, HTTP error responses, and duplicate content often surface and cause SEO drops for months. The good news is that it's still possible to move your site with minimal traffic losses. In this case, we share the proven migration strategy used for our client's website. Whether or not you memorize it word for word, you will find actionable tips on how to minimize risks during a site migration.
Oct 11, 2019
Hreflang implementation can become complex when working with a large number of languages and countries. The best way to handle this task is to learn from the experience of similar websites. This case walks you through monitoring and troubleshooting hreflang on an enterprise-level platform with 1.5 million pages and 2.5M+ monthly visits.
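To make the troubleshooting part concrete: a large share of hreflang problems are missing return tags. Below is a minimal, hypothetical sketch of a reciprocity check, assuming each page's hreflang annotations have already been extracted (for example from a crawl export); the URLs are invented for illustration:

```python
# Hypothetical hreflang reciprocity check: every alternate a page points to
# should point back to that page with some hreflang annotation.
hreflang_map = {
    "https://example.com/en/": {"en": "https://example.com/en/", "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/"},  # missing return tag to /en/
}

for page, alternates in hreflang_map.items():
    for lang, alt_url in alternates.items():
        if alt_url == page:
            continue
        return_tags = hreflang_map.get(alt_url, {})
        if page not in return_tags.values():
            print(f"Missing return tag: {alt_url} does not link back to {page}")
```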
Mar 20, 2019
SEO lead Ihor Bankovskiy has been working with JetOctopus for a year. In this article he shares the results of regular crawling & log analysis on the global tutoring platform Preply.com: insights on technical optimization, Ihor's feedback about the crawler, and much more.
Nov 12, 2018

While most SEOs agree that submitting a sitemap to Google Search Console is important, they may not know how to implement one in a way that drives SEO KPIs. Here is how to find and fix problems in the website's structure and optimize a sitemap both for Googlebot and for users.
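For illustration only (not the article's tooling), here is a minimal sketch of generating a basic sitemap.xml from a list of URLs; the URLs and output file name are assumptions:

```python
# Minimal sitemap.xml generator for illustration; the URL list is invented.
from xml.etree import ElementTree as ET

urls = ["https://example.com/", "https://example.com/blog/", "https://example.com/pricing/"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```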

Nov 6, 2018
Sometimes you delete or change pages and, as a result, they become "orphaned" (URLs that are no longer in the website structure but are still visited by bots). Crawl budget can be wasted on useless or outdated content instead of profitable pages. Here is a case where the bot wasted its resources on 3M useless pages. A case study of templatemonster.com.
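One hypothetical way to surface such orphans is to diff the URLs found by a structure crawl against the URLs Googlebot requests in the logs. A minimal sketch, assuming both lists are already exported to plain text files with made-up names:

```python
# Hypothetical orphan-page detection: URLs Googlebot requests (from logs)
# that the site-structure crawl never found. File names are assumptions.
def load_urls(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

in_structure = load_urls("crawled_urls.txt")        # URLs reachable via internal links
visited_by_bot = load_urls("log_visited_urls.txt")  # URLs seen in server logs for Googlebot

orphans = visited_by_bot - in_structure
print(f"{len(orphans)} orphan URLs are consuming crawl budget")
for url in sorted(orphans)[:20]:
    print(url)
```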
Oct 29, 2018
A medical portal with 700K pages and 2.7M monthly visits experienced a 40% SEO decline. We conducted a comprehensive technical audit to find the cause and developed a plan for technical optimization. Here are the most actionable insights.