Let’s talk about SEO. But not only about it.

Nov 27, 2019
When a search bot crawls your website and finds similar data on multiple URLs, it doesn’t know how to treat your content. In most cases, the bot trusts the clues you give it (unless you’re trying to manipulate search results). So, the game plan is to specify which pages are the originals and which are appreciably similar. Let’s find the best way to do it.
Nov 25, 2019
You probably know log files are treasure troves. You can see how bots crawl your pages and which content is frequently visited or, vice versa, ignored by bots. But SEOs often don’t know how to get value out of logs: exploring the sheer volume of data manually is time-consuming, and analytics tools may be costly. Still, you need a way to extract the data from your logs. Today we’re going to demonstrate how log file analysis helps reveal tag-related problems, indexability issues, and crawl budget waste.
Jan 30, 2019
Have you ever faced an SEO traffic drop? So have we. Guided by data from a partial crawl, our SEO specialist deleted relevant content (30% of the site!), which led to a dramatic traffic drop. The lesson we learned is that a partial technical audit is useless and even harmful. Why? Read on to avoid repeating our mistakes.
Nov 11, 2019
Changing a site’s CMS rarely goes smoothly. Issues such as broken links, HTTP error responses, and duplicate content often surface and cause SEO drops for months. The good news is that it’s still possible to move your site with minimal traffic losses. In this case study, we share a proven migration strategy for our client’s website. Take notes and memorize it word for word, or don’t. Either way, you will find actionable tips on how to minimize risks during a site migration.
Oct 22, 2019
According to research, 41% of adults and 55% of teens use voice search daily. The volume of voice search queries is growing, but how does that affect ranking algorithms? What priority should voice SEO have in 2020? What can you do now to gain the upper hand over competitors in the coming era of voice search? Let’s find out.
Oct 11, 2019
Hreflang implementation can become complex when you work with a large number of languages and countries. The best way to handle this task is to learn from the experience of similar websites. This case walks you through monitoring and troubleshooting hreflang annotations on an enterprise-level platform with 1.5 million pages and 2.5M+ monthly visits.
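As a quick reminder of what such annotations look like, here is a minimal sketch with hypothetical URLs; each language/region version must list every alternate, including itself:

```html
<!-- Hypothetical URLs: each version references all alternates, itself included -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/page" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
```

Missing a self-reference or a return link from one of the alternates is exactly the kind of error that multiplies quickly at this scale.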
Sep 30, 2019

Log files register each request the web server receives. If you want to get SEO insights from logs, you’ll need to deal with heavy data processing, and that can only be done by an enterprise-grade log analyzer. JetOctopus shows precisely how any bot is crawling your pages, so you can optimize your site to increase organic visibility. Sounds great, right? Wait till we show you how to get the most out of a log analyzer using a real website as an example.
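To make the idea concrete, here is a minimal sketch (not JetOctopus itself) of the kind of processing involved: parsing Apache/Nginx "combined"-format access-log lines and counting which paths Googlebot requests. The field layout and sample lines are assumptions; adjust the regex to your server’s actual log format.

```python
import re
from collections import Counter

# Assumed "combined" log format: ip ident user [time] "method path proto"
# status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Count paths requested by a Googlebot user agent."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits

# Hypothetical sample lines: one Googlebot request, one regular visitor
sample = [
    '66.249.66.1 - - [30/Sep/2019:10:00:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [30/Sep/2019:10:00:01 +0000] "GET /blog/ HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (Windows NT 10.0)"',
]
```

On a real site this runs over millions of lines per day, which is why the teaser above talks about enterprise-grade tooling rather than ad-hoc scripts.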

Aug 8, 2019

Google has included site speed (and, as a result, page speed) as a signal in its search ranking algorithms. Speeding up websites is crucial, not just for website owners but for all Internet users. The JetOctopus team encourages you to start looking at your webpages’ speed, not only to improve your ranking in Google but also to improve user experience and increase profitability.

Jun 20, 2019

Yes, robots.txt can be ignored by bots, and yes, it’s not secure: anyone can see the content of this file. Nevertheless, a well-considered robots.txt helps deliver your content to bots and keep low-priority pages out of the SERP. At first glance, writing directives is an easy task, but any kind of management requires your attention. Let’s go through common robots.txt mistakes and ways to avoid them.
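For reference, a minimal, well-formed robots.txt looks something like this (paths and URLs are hypothetical):

```
# Hypothetical example: keep low-priority sections out of the crawl
# and point bots to the sitemap.
User-agent: *
Disallow: /cart/
Disallow: /search?
Sitemap: https://example.com/sitemap.xml
```

Even a file this small can go wrong: a stray `Disallow: /` blocks the entire site, which is one of the classic mistakes covered in the article.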

May 13, 2019
Knowing how to move from small-scale content production to managing thousands of product descriptions, internal linking strategies, and multi-channel marketing can seem daunting. So what is the best course of action for companies that want to grow online? Let’s explore some proven strategies for easing the stress of scaling up content production.

Mar 27, 2019

Recently, Google Webmaster Central announced a series of SEO updates on Twitter with the hashtag #springiscoming. The news about the new way Google will treat pagination affected the SEO community greatly. Let’s let the dust settle and get to the bottom of this news.

Mar 20, 2019
SEO leader Ihor Bankovskiy has been working with JetOctopus for a year. In this article he shares the results of regular crawling and log analysis on the global tutoring platform Preply.com: insights on technical optimization, Ihor’s feedback about the crawler, and much more.
Mar 12, 2019

Canonical tags were created to fix duplicate content issues. In a nutshell, if you have three duplicate pages (or nearly identical ones), you should pick just one of them to be shown in Google. Canonicalization is a way to help Googlebot decide which page to show in the SERP. However, rel=canonical tags don’t help search engines unless you use them properly. This tutorial walks you through how to use JetOctopus to audit rel=canonical tags quickly and efficiently across a website.
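The tag itself is a one-liner in the page `<head>`; a minimal sketch with a hypothetical URL:

```html
<!-- Placed on each duplicate (and on the canonical page itself),
     pointing to the one preferred URL -->
<link rel="canonical" href="https://example.com/product/blue-widget" />
```

The audit is about verifying that every duplicate points to the right target, that canonical targets are indexable 200 pages, and that no chains or conflicting signals exist.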

Dec 13, 2018

A new JetOctopus product update is out! You can now compare two different crawl reports and see how your crawls evolve. With our unique module, you can not only track overall SEO dynamics but also compare changes across segments of your website. This approach helps you identify the strengths and weaknesses of each segment and replicate successful experiments across the whole website. Let’s focus on the real benefits you can get from our new module to maximize your ROI.

Nov 12, 2018

While most SEOs understand that submitting a sitemap to Google Search Console is important, they may not know the intricacies of implementing sitemaps in a way that drives SEO KPIs. Here is how to find and fix problems in a website’s structure and optimize a sitemap both for Googlebot and for users.
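For orientation, a minimal XML sitemap has one `<url>` entry per indexable page (the URL and date below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal hypothetical sitemap -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-11-01</lastmod>
  </url>
</urlset>
```

The optimization work is in deciding which URLs belong there: only canonical, indexable pages you actually want crawled, not every URL the CMS can generate.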

Nov 6, 2018
Sometimes you delete or change pages and, as a result, they become "orphaned": URLs that aren’t in the website structure but are still visited by bots. Crawl budget can be wasted on useless or outdated content instead of profitable pages. Here is a case where a bot wasted its resources on 3M useless pages. A case study of templatemonster.com.
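The detection itself boils down to a set difference between two URL lists; a minimal sketch, assuming you already have the set of URLs a crawler can reach and the set of URLs seen in bot logs (both illustrative here):

```python
# "Orphan" URLs: pages bots still request (seen in logs) but that a site
# crawl can no longer reach from the internal link structure.
def orphan_urls(crawled_urls, logged_urls):
    """URLs hit by bots but absent from the crawlable site structure."""
    return sorted(set(logged_urls) - set(crawled_urls))

# Hypothetical inputs: a real crawl and log parse would produce these sets
crawled = {"/", "/catalog/", "/catalog/item-1"}
logged = {"/", "/catalog/item-1", "/old-landing", "/catalog/deleted-item"}
```

At templatemonster.com’s scale the same comparison surfaced millions of such URLs, which is where the crawl budget was leaking.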
Oct 29, 2018
A medical portal with 700K pages and 2.7M monthly visits experienced a 40% SEO traffic drop. We conducted a comprehensive technical audit to find the cause and developed a plan for technical optimization. Here are the most actionable insights.
Oct 29, 2018
Recently we read a sad post on Facebook about a sudden hardware malfunction. There was no one to help, cunning competitors used submit-URL tools, and as a result the website was knocked out of the top organic search positions. The conclusion: websites need to be monitored more thoroughly and bugs fixed without delay. Today we want to present another way to address such problems.
Oct 22, 2018
Logs are unique data: they are 100% accurate and let you fully understand how Googlebot crawls your website. Deep log analysis can help boost indexability and rankings, bring valuable traffic, and improve conversions and sales. Although Google’s search bot algorithms change constantly, log file analysis helps you see long-term trends.
Nov 13, 2018
More than 80 percent of internet users only visit the top five results when they search for a keyword on Google or other search engines. For that reason, optimizing websites is challenging, and the fact that algorithms keep changing does not make things any easier. In this article, I will show you why you should pay attention to internal links and which SEO linking tool can improve them for a better ranking.
Aug 21, 2018
JetOctopus is probably one of the most efficient crawlers on the market. It’s fast and incredibly easy to use, even for a non-SEO. Its most convincing selling point is that it has no crawl limits, no simultaneous crawl limits, and no project limits, giving you more data for less money. If you are working on a huge database-driven website, you’ll definitely find it a money- and time-saver.
Aug 13, 2018
Interlinking is one of the most powerful tools for the technical optimization of a big website. Thanks to interlinking factors, it’s easy to move pages several positions up in the SERP for super-long-tail keywords. On many websites we see typical interlinking problems: unimportant pages with too much link weight, over-spammed or insufficient anchor distribution, whole clusters of pages with just one incoming link, and more.
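One of the problems above, pages with just one incoming link, is straightforward to detect from a crawl’s edge list. A minimal sketch, with an illustrative edge list in place of real crawl output:

```python
from collections import Counter

# Given (source -> target) internal link edges from a crawl, find pages
# whose incoming-link count is at or below a threshold.
def weakly_linked(edges, max_inlinks=1):
    """Map each weakly linked page to its incoming-link count."""
    inlinks = Counter(target for _, target in edges)
    return {page: n for page, n in inlinks.items() if n <= max_inlinks}

# Hypothetical edge list; a real one comes from your crawler
edges = [
    ("/", "/category/"),
    ("/", "/top-product"),
    ("/category/", "/top-product"),
    ("/category/", "/lonely-product"),
]
```

The same Counter of inlinks also exposes the opposite problem: unimportant pages accumulating far more internal link weight than they deserve.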

Aug 8, 2018
Every webmaster sooner or later faces the necessity of a technical audit of their website, which is not an easy task in itself and is extremely difficult for big sites.
Crawling can be easy with crawlers such as Screaming Frog and JetOctopus. You don’t have to be a professional to cope with them; even a newbie can do it. Let’s see which is more convenient.
Aug 3, 2018
Have you ever wondered how many websites there are on the Internet and what is going on with them? I always wondered whether it was possible to see all the domains on the Internet instead of analyzing a small selected sample. Here are the results of my research.
Jul 24, 2018
Looking at websites that get a lot of traffic, some might think that this is due to their technical perfection. Others are convinced that the older such websites are, the more traffic they get, thanks to age and trust. Let’s check who is right using a quickly developing e-commerce field: the fashion industry.
Jun 12, 2018
What’s the main difference from Screaming Frog? This is the most popular question we get from potential clients. To cut a long story short, look inside for the differences. They are game-changing.
Jun 12, 2018
When we needed to crawl 5 million pages at our first business (a job aggregator) and calculated the budget for it (all the money in the world), we decided to write our own crawler. What happened next is the most interesting part.
Jun 6, 2018
Is it temporary, or is it time to start worrying and switch to something else? The news that Yandex had stopped indexing websites created at wix.com was first featured on June 5th. Worried webmasters started discussing the situation, saying that website builders are better not used at all anymore.