
Product Update: Linking Explorer – a database on interlinking for each page

Aug 13, 2018

Interlinking is one of the most powerful tools for the technical optimization of a large website. Thanks to internal linking factors, it is easy to push pages several positions up in Google search results for long-tail and super-long-tail keywords.

Unfortunately, on many websites we face typical interlinking problems: unimportant pages with too much link weight, over-optimized or insufficient anchor distribution, whole clusters of pages with just one incoming link, and so on.

One of the main problems when working with interlinking is link volume. For instance, an average website with 1 million pages may have around 200 million links, and as you can guess, it is quite inconvenient to work with them via CSV files or Excel.

Most crawlers have an indicator such as "number of internal links to the page," but the question is: what links are those? How many of them are image links and how many are text links? What anchors lead to the page? Are those links from indexed pages, or from non-canonical ones that add no internal weight?
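To see why a single "incoming links" counter hides so much, here is a minimal sketch of the kind of aggregation a crawler does over its edge list. The tuple layout and sample data are made up for illustration; a real crawler export will have its own schema.

```python
from collections import defaultdict

# Hypothetical edge list: (source_url, target_url, is_image_link, is_nofollow).
# The URLs and flags below are invented sample data.
links = [
    ("http://site.com/", "http://site.com/buy-black-iphone-la", False, False),
    ("http://site.com/blog/", "http://site.com/buy-black-iphone-la", True, False),
    ("http://site.com/old/", "http://site.com/buy-black-iphone-la", False, True),
]

# Per-target counters: total links, image vs. text split,
# nofollow count, and the set of unique linking pages.
stats = defaultdict(lambda: {"total": 0, "image": 0, "text": 0,
                             "nofollow": 0, "sources": set()})

for source, target, is_image, is_nofollow in links:
    s = stats[target]
    s["total"] += 1
    s["image" if is_image else "text"] += 1
    s["nofollow"] += is_nofollow
    s["sources"].add(source)

for target, s in stats.items():
    pct_nofollow = 100 * s["nofollow"] / s["total"]
    print(f'{target}: {s["total"]} links, {s["image"]} image / {s["text"]} text, '
          f'{pct_nofollow:.0f}% nofollow, {len(s["sources"])} unique pages')
```

Even this toy breakdown already separates "3 incoming links" into something actionable: which links carry anchor weight and which are nofollow or image-only.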

We are glad to present Linking Explorer

This is a new tool for internal linking analysis. It takes data directly from the link graph and shows aggregated information on incoming links, anchor distribution, the directories they come from, and click distance.

It can:

  1. Show information on a single URL of a website (http://site.com/buy-black-iphone-la) or on a URL mask (http://site.com/*-iphone-*)

    The ability to work with URL masks makes working with page clusters much easier.

    Example: you sell iPhones in a wide range of models and colors, which means you have lots of pages dedicated to the iPhone. It is rather troublesome to check each link separately. Now you can simply set an "iphone" URL mask and get the full picture for the whole iPhone cluster.
  2. Show the characteristics of all incoming links – % nofollow, text and image links, unique pages

    In short: their importance, content, and weight
  3. Show the distribution of the anchors leading to the page

    Google analyzes the content and semantics of a landing page by the anchors leading to it, so anchors can be used to expand the page's semantics. And as you can guess, when analyzing the anchors leading to a page, we can either expand this list or remove non-relevant, spammed anchors.
  4. Show the distribution of the alt attributes of image links leading to the page

    Google analyzes the content and semantics of a landing page by the alt attributes of the images linking to it, so they too can be used to expand the page's semantics. When analyzing this parameter, we can either add image alts or delete non-relevant ones.
  5. Show the distribution of incoming directories

    It shows the directories of the pages that link to the landing page. Example: if the landing page is "Buy a chair in London," then incoming directories like "Buy a car in Edinburgh" are hardly appropriate.
  6. Show the click distance from the main page

    It shows the click distance from the main page for the pages that link to the landing page. For important pages, the click distance from the main page should be as short as possible.
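The URL-mask idea from item 1 can be sketched with Python's shell-style wildcard matching. The URL list below is invented; the mask mirrors the `*-iphone-*` example above.

```python
from fnmatch import fnmatch

# Hypothetical crawled URLs (sample data, not a real export).
urls = [
    "http://site.com/buy-black-iphone-la",
    "http://site.com/red-iphone-cases",
    "http://site.com/buy-a-chair-in-london",
]

# Shell-style wildcards: "*" matches any run of characters,
# so this mask selects the whole iPhone cluster at once.
mask = "http://site.com/*-iphone-*"
cluster = [u for u in urls if fnmatch(u, mask)]
print(cluster)
```

Instead of checking every iPhone page one by one, a single mask pulls the whole cluster into one aggregated view.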
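Click distance (item 6) is simply the shortest path from the homepage in the link graph, which breadth-first search computes directly. The adjacency list here is a made-up miniature site, not anything from the tool itself.

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
graph = {
    "/": ["/phones", "/blog"],
    "/phones": ["/buy-black-iphone-la"],
    "/blog": ["/phones"],
    "/buy-black-iphone-la": [],
}

def click_distance(graph, start="/"):
    """Breadth-first search: minimum number of clicks from the homepage to each page."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in dist:  # first visit is via the shortest click path
                dist[linked] = dist[page] + 1
                queue.append(linked)
    return dist

print(click_distance(graph))
```

Pages missing from the result are unreachable from the homepage, which is itself a useful interlinking signal.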


  • This tool can work in segment mode on the website.
    For example, you create a segment "Cell phones" (URL contains "cell phone"). While you stay in this segment, Linking Explorer shows you information only on the links to the landing page that come from this segment.
  • You can also analyze external URLs.
    For instance, you have a related website site.de, and now you can see how it is linked with site.com. That means you get all of the data mentioned above from the crawled website for a landing page on the external site.

Everything you see is clickable. Click a title and you go to the Linking Explorer data table, where you can adjust all the necessary filters.


With Linking Explorer, the process of interlinking becomes a thousand times easier. All the important information you need is presented in one tool, with absolutely no limits!

Get more useful info: Product Update: Compare Crawls


Serge Bezborodov is the CTO of JetOctopus. He is a professional programmer with 9+ years of experience. He worked with aggregators for 5 years – vacancies, real estate, cars. He also has experience in designing DB architecture and in query optimization. Serge has crawled more than 160 million pages and 40 TB of data with JetOctopus.

Auto classified, 20m pages crawled
What problem was your SEO department working on when you decided to try our crawler?
We needed to detect all possible errors in no time because Google Search Console shows only 1000 results per day.

What problems did you find?
That’s quite a broad question. We managed to detect old, unsupported pages and errors related to them. We also found a large number of duplicated pages and pages with 404 response code.

How quickly did you implement the required changes after the crawl?
We are still implementing them because the website is large and there are lots of errors on it. Four teams are currently working on the website, so we have to assign each particular error to a particular team and draw up individual statements of work.

And what were the results?
It's quite difficult to measure results right now because we constantly work on the website and make changes. But a higher scan frequency by bots would mean the changes are productive. However, around a month and a half ago we allowed all the paginated pages to be indexed, and this has already affected our statistics.

Having seen the crawl report, what was the most surprising thing you found? (Were there errors you’d never thought you’d find?)
I was surprised to find so many old, unsupported pages that are outside the website's structure. There was also a large number of 404 pages. We are really glad we managed to get a breakdown of the website's subdirectories; that helped us decide which team to start working with first.

You have worked with different crawlers. Can you compare JetOctopus with the others and assess it?
Every crawler looks for errors and finds them. The main point is the balance between the number of scanned pages and the price. JetOctopus is one of the most affordable crawlers.

Would you recommend JetOctopus to your friends?
We’re going to use it within our company from time to time. I would recommend the crawler to my friends if they were SEO optimizers.

Your suggestions for JetOctopus.
To refine the web version ASAP. There are a few things we were missing badly:
Thank you very much for such a detailed analysis. We are currently reflecting on a redirect problem.