Jan 15, 2020
We are so excited about this update because it completely changes the way you interact with GSC data. We used to watch the trends of Impressions, Clicks, CTR, Position, and so on. That gave us a general understanding of the situation in the SERPs, but it didn't give us any details. This is no longer the case! Now you will know exactly which groups of URLs experienced massive growth and which experienced a massive drop (in impressions, CTR, clicks, active pages, etc.), and you can then get the exact list of those URLs. Let's look at the main features a little closer:
Nov 27, 2019
When a search bot crawls your website and finds similar content on multiple URLs, it doesn't know how to treat that content. In most cases, the bot trusts the clues you give it (unless you are trying to manipulate search results). So the game plan is to specify which pages are the originals and which are merely similar. Let's find the best way to do it.
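The standard clue for "this page is a copy of that one" is a `rel="canonical"` link element in the duplicate page's `<head>`. As a minimal illustration (the `find_canonical` helper and the example.com URL are ours, not from the post), here is a sketch of extracting that signal with Python's standard library:

```python
from html.parser import HTMLParser


class CanonicalParser(HTMLParser):
    """Extract the rel="canonical" URL from an HTML document, if present."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if (attrs.get("rel") or "").lower() == "canonical":
                self.canonical = attrs.get("href")


def find_canonical(html):
    """Return the canonical URL declared by a page, or None if there isn't one."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical


# A near-duplicate URL pointing the bot at the original page
page = '<html><head><link rel="canonical" href="https://example.com/shoes"></head></html>'
print(find_canonical(page))  # https://example.com/shoes
```

A crawler can run a check like this over every URL it fetches and flag duplicates that forgot to declare a canonical at all.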
Sep 30, 2019

Log files register every request the web server receives. If you want to get SEO insights from logs, you'll need to handle heavy data processing, and that is best done with an enterprise-grade log analyzer. JetOctopus shows precisely how each bot crawls your pages, so you can optimize your site to increase organic visibility. Sounds great, right? Wait till we show you how to get the most out of a log analyzer using a real website as an example.
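To make "SEO insights from logs" concrete: the raw material is usually lines in the Combined Log Format, and the first question is which URLs a given bot actually hits. This is not JetOctopus's implementation, just a rough sketch of the idea (the regex, function name, and sample lines are all invented for illustration):

```python
import re
from collections import Counter

# Combined Log Format:
# ip - - [time] "METHOD path HTTP/x" status size "referer" "user-agent"
LOG_RE = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)


def bot_hits_per_path(lines, bot="Googlebot"):
    """Count how often a given bot requested each URL path."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and bot in m.group("agent"):
            hits[m.group("path")] += 1
    return hits


sample = [
    '66.249.66.1 - - [30/Sep/2019:10:00:00 +0000] "GET /product/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [30/Sep/2019:10:00:01 +0000] "GET /product/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [30/Sep/2019:10:00:02 +0000] "GET /product/2 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(bot_hits_per_path(sample))  # Counter({'/product/1': 2})
```

On a real site this runs over millions of lines per day, which is exactly the "heavy data processing" the post refers to.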

Dec 13, 2018

JetOctopus's new product update is out! You can now compare two crawl reports and see how your crawl has evolved. With our unique module, you can not only watch overall SEO dynamics but also compare changes in individual segments of your website. This approach helps you identify the strengths and weaknesses of each segment and replicate successful experiments across the whole website. Let's focus on the real benefits you can get from our new module to maximize your ROI.
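The module itself isn't shown here, but segment-level comparison boils down to diffing a metric between two crawl snapshots. As a toy illustration under our own assumptions (the function name, segments, and numbers are hypothetical):

```python
def crawl_diff(old, new):
    """Compare two crawl snapshots of {segment: metric} and return per-segment deltas."""
    return {seg: new.get(seg, 0) - old.get(seg, 0) for seg in set(old) | set(new)}


# Hypothetical indexable-page counts per segment, before and after a release
before = {"blog": 120, "products": 5000}
after = {"blog": 150, "products": 4600}

delta = crawl_diff(before, after)  # blog: +30, products: -400
```

A negative delta in one segment (here, 400 product pages gone) is exactly the kind of localized change an overall site-wide trend would hide.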

Aug 21, 2018
JetOctopus is probably one of the most efficient crawlers on the market. It's fast and incredibly easy to use, even for a non-SEO. Its most convincing selling point is that it has no crawl limits, no simultaneous-crawl limits, and no project limits, giving you more data for less money. If you are working on a huge database-driven website, you'll definitely find it a money and time saver.
Aug 13, 2018
Interlinking is one of the most powerful tools for the technical optimization of a big website. Thanks to interlinking factors, it's easy to push pages several positions up in the SERPs for super-long-tail keywords. On many websites we run into typical interlinking problems: unimportant pages carrying too much link weight, over-spammed or insufficient anchor distribution, whole clusters of pages with just one incoming link, and so on.
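One of the problems above, pages reachable through just a single incoming link, is easy to spot once the internal-link graph is extracted from a crawl. A back-of-envelope sketch (the link graph and names are invented for illustration):

```python
from collections import Counter


def inlink_counts(edges):
    """Count inbound internal links per target page from (source, target) pairs."""
    return Counter(target for _, target in edges)


# Hypothetical internal-link graph: /orphan-ish hangs on a single link
links = [
    ("/home", "/category"),
    ("/home", "/about"),
    ("/category", "/orphan-ish"),
    ("/about", "/category"),
]

weak = sorted(page for page, n in inlink_counts(links).items() if n == 1)
print(weak)  # ['/about', '/orphan-ish']
```

Pages that surface in `weak` are candidates for extra internal links; the same counts also reveal the opposite problem, unimportant pages that accumulate too much link weight.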

Jun 12, 2018
When we needed to crawl 5 million pages at our first business (a job aggregator) and calculated the budget for it (all the money in the world), we decided to write our own crawler. What happened next is the most interesting part.