
2 Different Realities: Your Site Structure & How Google Perceives It

Nov 6, 2018


3 M pages of potential crawl budget waste, and a 3x mismatch between the site and Google's view of it that we had never seen before.

Case study: templatemonster.com

About TemplateMonster:

  1. Delivering website templates online since 2002
  2. 1 M pages
  3. 5 M monthly visits

The main challenge:

To find and fix technical bugs before the website's migration, and to know about existing problems in order to avoid pitfalls on the new CMS.

What was done:

  1. We crawled the website with JetOctopus to find technical errors.
  2. We looked at the website through Google’s ‘eyes’ with log analysis to see how the search bot scans templatemonster.com (a minimal log-parsing sketch follows this list).
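
For illustration, here is a minimal sketch of the log-analysis step: counting Googlebot hits per URL in a standard combined-format access log and flagging 5XX responses. The file name, log format, and regex are assumptions for this example; the actual analysis was done inside JetOctopus.

```python
import re
from collections import Counter

# Assumed combined log format; adapt the regex to your server's actual format.
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .* "(?P<ua>[^"]*)"$'
)

def googlebot_hits(log_path):
    """Count Googlebot requests per URL and collect URLs that returned 5XX."""
    hits, errors_5xx = Counter(), Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = LINE_RE.search(line)
            # Simple user-agent match; a real check would also verify reverse DNS.
            if not m or "Googlebot" not in m.group("ua"):
                continue
            hits[m.group("url")] += 1
            if m.group("status").startswith("5"):
                errors_5xx[m.group("url")] += 1
    return hits, errors_5xx

# "access.log" is a hypothetical path to the server log export.
hits, errors_5xx = googlebot_hits("access.log")
print(hits.most_common(20))        # URLs Googlebot requests most often
print(errors_5xx.most_common(20))  # 5XX URLs that waste crawl budget
```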

What problems were detected?

  1. The website has 1 M pages, but there are 3 M pages that Google regularly visits, and we DON'T KNOW about these pages!
  2. There are 250 K pages that aren’t crawled by Google at all, and this list includes valuable pages too (a sketch of the crawl-vs-logs comparison follows this list).
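
The kind of comparison behind these numbers can be sketched as a simple set difference between the URLs found by the crawler and the URLs Googlebot actually requested according to the logs. The two input files below are hypothetical exports with one URL per line; the real matching was done in JetOctopus.

```python
# Minimal crawl-vs-logs comparison, assuming one URL per line in each file.
def load_urls(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

crawled = load_urls("crawled_urls.txt")    # URLs reachable through the site structure
logged = load_urls("googlebot_urls.txt")   # URLs Googlebot requested (from the logs)

orphans = logged - crawled      # visited by Google but missing from the structure
not_crawled = crawled - logged  # in the structure but never requested by Google

print(f"Orphan URLs known only from logs: {len(orphans)}")
print(f"Structured URLs Googlebot never visited: {len(not_crawled)}")
```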

What recommendations we gave:

  1. Go through the log analysis report and check the URLs among the 3 M unknown webpages manually. Then link the valuable, commercial pages into the site structure.
  2. Delete useless orphaned webpages and pages with bugs. Special attention should be paid to webpages with a 5XX status code: Googlebot visits these pages frequently and wastes crawl budget on URLs with bugs.
  3. Analyze the 250 K pages that aren’t crawled by Google. Add links from indexable pages to these valuable pages and generate a new sitemap, which works like an invitation for Googlebot to recrawl the website (a minimal sitemap sketch follows this list).
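
As a rough illustration of the sitemap recommendation, the sketch below writes a basic XML sitemap from a list of valuable URLs. The input file name and the lastmod value are assumptions for the example.

```python
# Minimal sitemap generator, assuming valuable_urls.txt holds one absolute URL per line.
from datetime import date
from xml.sax.saxutils import escape

def write_sitemap(urls, path="sitemap.xml"):
    today = date.today().isoformat()
    with open(path, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            out.write(f"  <url><loc>{escape(url)}</loc><lastmod>{today}</lastmod></url>\n")
        out.write("</urlset>\n")

with open("valuable_urls.txt", encoding="utf-8") as f:
    write_sitemap(line.strip() for line in f if line.strip())
```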

Client’s feedback:


Serge Bezborodov is the CTO of JetOctopus. He is a professional programmer with 9+ years of experience. He worked with aggregators for 5 years: job vacancies, real estate, and cars. He also has experience in designing DB architecture and in query optimization. With JetOctopus, Serge has crawled more than 160 M pages and 40 TB of data.

Auto classifieds, 20 M pages crawled
What problem was your SEO department working on when you decided to try our crawler?
We needed to detect all possible errors very quickly, because Google Search Console shows only 1000 results per day.

What problems did you find?
That’s quite a broad question. We managed to detect old, unsupported pages and errors related to them. We also found a large number of duplicated pages and pages with 404 response code.

How quickly did you implement the required changes after the crawl?
We are still implementing them because the website is large and there are lots of errors on it. There are currently 4 teams working on the website, so we have to assign each error to a particular team and draw up individual statements of work.

And what were the results?
It’s quite difficult to measure results right now because we constantly work on the website and make changes, but a higher scan frequency by bots would mean the changes are productive. However, around one and a half months ago we allowed all the paginated pages to be indexed, and this has already affected our statistics.

Having seen the crawl report, what was the most surprising thing you found? (Were there errors you’d never thought you’d find?)
I was surprised to find so many old, unsupported pages that are outside the website’s structure. There was also a large number of 404 pages. We are really glad we’ve managed to get a breakdown by website subdirectory, so we’ve been able to decide which team to start working with first.

You have worked with different crawlers. Can you compare JetOctopus with the others and assess it?
Every crawler looks for errors and finds them. The main point is the balance between the number of scanned pages and the price. JetOctopus is one of the most affordable crawlers.

Would you recommend JetOctopus to your friends?
We’re going to use it within our company from time to time. I would recommend the crawler to my friends if they were SEO specialists.

Your suggestions for JetOctopus.
Refine the web version ASAP; there are a few features we were badly missing.
Thank you very much for such a detailed analysis. We are currently working through a redirect problem.