Let’s talk about SEO. But not only about it

Aug 8, 2019

Why Website Speed Matters

Google has included site speed (and, as a result, page speed) as a signal in its search ranking algorithms. Speeding up websites is crucial, not just for website owners but for all Internet users.

Jun 20, 2019

Yes, robots.txt can be ignored by bots, and yes, it's not secure: anyone can see the contents of this file by visiting http://www.yourwebsite.com/robots.txt. Nevertheless, a well-considered robots.txt helps deliver your relevant content to search engine bots and keep low-priority pages out of the SERP. At first glance, writing directives is an easy task, but any kind of management requires attention and forethought. Let's go through the most common robots.txt mistakes and find out how to establish a constructive dialogue with Googlebot.
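Before auditing a live file, it helps to see how a well-behaved crawler actually interprets directives. Here is a minimal sketch using Python's standard `urllib.robotparser`; the domain and paths below are hypothetical, chosen only for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: one wildcard group with two Disallow rules
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /internal-search
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot falls under the "User-agent: *" group here
print(parser.can_fetch("Googlebot", "https://example.com/admin/users"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post-1"))  # True
```

Running your most important URLs through a check like this is a quick sanity test that a new Disallow rule does not accidentally block revenue-driving sections.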

May 13, 2019

At JetOctopus, we commonly work with small businesses that desperately want to expand. However, when they try to create content to match their ambitions, they find that scaling up is a massive problem. Knowing how to move from small-scale content production to managing thousands of product descriptions, internal linking strategies, and multi-channel marketing can seem daunting. So what is the best course of action for companies that want to grow online? Creating top-tier content in large quantities isn't easy, and it can definitely be done badly. However, it can be done. Let's explore some proven strategies that ease the stress of scaling up content production.

Apr 12, 2019

The Head of Digital Marketing at HeadHunter on the results of regular crawling & log analysis

Mar 27, 2019

Winter Spring is Coming!

Recently, Google Webmaster Central announced a series of SEO updates on Twitter under the hashtag #springiscoming. The news about the new way Google will treat pagination affected the SEO community greatly. Let's let the dust settle and get to the bottom of this news.

Mar 20, 2019

An SEO leader on the results of regular crawling & log analysis

Mar 12, 2019

Canonical tags were created to fix duplicate content issues. In a nutshell, if you have three duplicate pages (or approximately similar ones), you should pick just one of them to be shown in Google. Canonicalization is a way to help Googlebot decide which page to show in the SERP. However, rel=canonical tags don't help the search engines unless you use them properly. This tutorial walks you through how you can use JetOctopus to audit rel=canonical tags quickly and efficiently across a website.
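As a quick refresher, the canonical hint is a single `<link>` element in the `<head>` of each duplicate page, pointing at the preferred URL. A minimal sketch (example.com and the path are hypothetical):

```html
<head>
  <!-- On every duplicate or near-duplicate page, point to the one
       preferred URL you want to appear in the SERP -->
  <link rel="canonical" href="https://example.com/products/red-sneakers" />
</head>
```

An audit then boils down to checking that each page declares exactly one canonical, that it points to an indexable 200-status URL, and that it is consistent with your internal links and sitemap.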

Jan 30, 2019
Have you ever faced an SEO drop? So have we. Guided by data from a partial crawl, our SEO specialist deleted relevant content, which led to losses of traffic and SERP positions. The lesson we learned is that a partial technical site audit is useless and even harmful for websites. Why? Read below to avoid repeating our mistake.
Dec 13, 2018

JetOctopus's new product update is out! You can now compare two different crawl reports and see how your crawls evolve! With our unique module, you will be able not only to watch overall SEO dynamics but also to compare changes in segments of your website. This approach helps to identify the strengths and weaknesses of each segment and replicate successful experiments across the whole website. Let's focus on the real benefits you can get from our new module to maximize your ROI.

Nov 12, 2018

How to find and fix problems in the website's structure and optimize a sitemap both for Googlebot and for users

Nov 6, 2018

3M of possible crawl budget waste and a never-before-seen threefold mismatch!

Case study: templatemonster.com

Oct 29, 2018
How we identified the reasons for the decrease and improved the situation
Oct 29, 2018
Recently we read a sad post on Facebook about a sudden hardware malfunction. There was no one to help, cunning competitors used URL-submission tools, and as a result the website was knocked off the top organic search positions. The conclusion: websites need to be monitored more thoroughly and bugs fixed without delay. Today we want to present another way to address such problems.
Oct 22, 2018
Logs are unique data that show with 100% accuracy how Googlebot crawls the website. Deep log analysis can help boost indexability and ranking, capture valuable traffic, and improve conversions and sales.
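To give a flavor of where deep log analysis starts, here is a minimal sketch that counts Googlebot hits per URL in combined-format access logs. The sample lines, IPs, and paths are made up for illustration:

```python
import re
from collections import Counter

# Extract the request path and user agent from a combined-format log line
LOG_LINE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

sample_logs = [
    '66.249.66.1 - - [22/Oct/2018:10:00:00 +0000] "GET /products/1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [22/Oct/2018:10:00:05 +0000] "GET /products/1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.7 - - [22/Oct/2018:10:00:09 +0000] "GET /about HTTP/1.1" 200 128 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]

googlebot_hits = Counter()
for line in sample_logs:
    m = LOG_LINE.search(line)
    if m and "Googlebot" in m.group("agent"):
        googlebot_hits[m.group("path")] += 1

print(googlebot_hits.most_common())  # [('/products/1', 2)]
```

Comparing these counts against your crawl data immediately reveals pages Googlebot never visits, and pages where it wastes crawl budget.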
Nov 13, 2018
Let's face it: more than 80 percent of internet users only visit the top five results when they search for a keyword on Google or other search engines. This should tell you that if your page is not at least appearing on the first page for your keywords, you are likely to have a tough time getting any traffic to your website. For that reason, the process of optimizing websites for SEO is long and challenging, and the fact that the rules and algorithms keep changing does not make things any more straightforward. In this article, I am going to show you why you should pay attention to the internal links of your site and which SEO linking tool can improve them for a better ranking.
Oct 9, 2018
How the global tutoring platform Preply.com significantly increased the number of indexed pages and corrected all hreflang tags on the website. Read the post to benefit from their valuable experience!
Aug 21, 2018
JetOctopus is probably one of the most efficient crawlers on the market. It's fast and incredibly easy to use, even for a non-SEO. Its most convincing selling point is that it has no crawl limits, no simultaneous crawl limits, and no project limits, giving you more data for less money. If you are working on a huge database-driven website, you'll definitely find it a money- and time-saver.
Aug 13, 2018
Interlinking is one of the most powerful tools for the technical optimization of a big website. Thanks to interlinking factors, it's easy to move pages several positions up in Google search results for long-tail and super-long-tail keywords.
Aug 8, 2018
With SEO marketing developing actively all over the world, SEO experts are searching for the most effective SEO tools.
Every webmaster sooner or later faces the need for a technical website audit, which is not an easy task in itself and is extremely difficult with big sites.
Crawling can be easy with crawlers such as Screaming Frog and JetOctopus. You don't have to be a professional to use them; even a newbie can manage. Let's see which is more convenient.
Aug 3, 2018
  • Research and forecast in IT
  • Data Mining
  • Big Data
  • It took around 36 hours to check 252 million domains
Jul 24, 2018
Looking at the leading websites that get a lot of traffic, some might think this is due to their technical perfection. Others are convinced that the older and bigger such websites are, the more technical problems arise, and that their huge traffic comes solely from age and trust. We decided to check who is right, using one of the most quickly developing fields in e-commerce: the fashion industry.
Jun 12, 2018
What's the main difference from Screaming Frog? This is the most popular question we get from potential clients. To cut a long story short, look inside for the differences. They are game-changing.
Jun 12, 2018
When we needed to crawl 5 million pages at our first business (a job aggregator) and calculated the budget for it (all the money in the world), we decided to write our own crawler. What happened next is the most interesting part.
Jun 6, 2018
Is it temporary or is it time to start worrying and switch to something else?