
Case study. Preply.com: a 7% increase in indexability.

Oct 9, 2018


How the global tutoring platform Preply.com significantly increased its number of indexed pages and fixed all the hreflang tags on the website. Read on to learn from their experience!

About Preply.com:

  1. 1.5 M pages
  2. Millions of monthly visits
  3. Tech SEO lead: Ihor Bankovsky, an SEO geek with 7 years of experience

First, the Preply.com team crawled their site with JetOctopus. Then they analyzed the crawl report. The data showed that hreflang usage needed to be optimized across the whole website and that old, incorrect tags had to be removed.

Let's find out why hreflang tags are important for SEO. The hreflang tag is a solution for websites that serve the same content in different languages. Naturally, it's important to show content in the user's native language. For instance, suppose a potential customer is French, and the page shown to them in search results is the English one, even though a French version of the page also exists. You want Google to show the French page in the SERP for that user. This is the main task of hreflang tags, and Google Search Console's help documentation summarizes their usage along the same lines.
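To make the mechanics concrete, here is a minimal sketch (not taken from the Preply case itself; the URLs and language set are hypothetical) of how reciprocal hreflang annotations for one page can be generated. Each language version must list all versions, including itself, plus an x-default fallback:

```python
# Hypothetical language versions of a single tutor-listing page.
versions = {
    "en": "https://example.com/en/tutors/french",
    "fr": "https://example.com/fr/professeurs/francais",
    "de": "https://example.com/de/lehrer/franzoesisch",
}

def hreflang_tags(versions, default_lang="en"):
    """Build the <link rel="alternate"> tags every version should carry.

    Each language page must reference ALL versions (itself included),
    plus an x-default tag pointing at the fallback page.
    """
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(versions.items())
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" '
        f'href="{versions[default_lang]}" />'
    )
    return tags

for tag in hreflang_tags(versions):
    print(tag)
```

The key property is reciprocity: if the French page points at the English one, the English page must point back, or Google may ignore the annotations.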

The Preply team developed a strategy to address the hreflang problem and then implemented it step by step. Let's look at the main stages:

  1. Crawl the whole website (around 2 M pages) with JetOctopus (very fast and easy to use)
  2. Use the crawl data to find hreflang tags linking to noindex pages and to pages returning a 404 status code
  3. Create segments for each website version and write tasks for the dev team
  4. Implement all the needed fixes
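Step 2 above can be sketched in code. This is a simplified illustration, not Preply's actual tooling: the crawl results are a hypothetical in-memory mapping, whereas a real audit would work from a crawler export.

```python
# Simplified crawl results: URL -> (HTTP status, robots meta content).
# In a real audit these would come from the crawler's data export.
crawl = {
    "https://example.com/en/page": (200, "index,follow"),
    "https://example.com/fr/page": (404, ""),
    "https://example.com/de/page": (200, "noindex,follow"),
}

# hreflang targets found on a page during the crawl.
hreflang_targets = [
    "https://example.com/en/page",
    "https://example.com/fr/page",
    "https://example.com/de/page",
]

def broken_hreflang(targets, crawl):
    """Return hreflang targets that should not be referenced:
    pages with a non-200 status or a noindex directive."""
    bad = []
    for url in targets:
        status, robots = crawl.get(url, (None, ""))
        if status != 200 or "noindex" in robots:
            bad.append(url)
    return bad

print(broken_hreflang(hreflang_targets, crawl))
```

Here the 404 French page and the noindex German page would be flagged; the corresponding hreflang tags are exactly the "old, incorrect tags" the team needed to delete.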

The result of these actions is impressive:

The number of indexed pages increased by 7% within 2 weeks, and it keeps growing!

Short but honest feedback about JetOctopus from tech SEO lead Ihor Bankovsky:

To sum up, the hreflang tag is a solution for websites that have duplicate content in different languages. If you want to optimize a multilingual website, crawl all its pages, find the hreflang tags with bugs, and implement all the needed fixes!

Get more useful info: How we scanned the Internet and what we’ve found


Serge Bezborodov is the CTO of JetOctopus. He is a professional programmer with 9+ years of experience. He has worked with aggregators for 5 years: vacancies, real estate, cars. He also has experience in designing DB architecture and in query optimization. Serge has crawled more than 160 M pages and 40 TB of data with JetOctopus.

Watch the recording of our webinar about log file analysis

Auto classified, 20m pages crawled
What problem was your SEO department working on when you decided to try our crawler?
We needed to detect all possible errors in no time because Google Search Console shows only 1000 results per day.

What problems did you find?
That's quite a broad question. We managed to detect old, unsupported pages and the errors related to them. We also found a large number of duplicate pages and pages with a 404 response code.

How quickly did you implement the required changes after the crawl?
We are still implementing them, because the website is large and there are lots of errors on it. There are currently 4 teams working on the website, so we have to assign each particular error to a particular team and draw up individual statements of work.

And what were the results?
It's quite difficult to measure results right now, because we constantly work on the website and make changes. But a higher scan frequency by bots would mean the changes are productive. That said, around a month and a half ago we allowed all the paginated pages to be indexed, and this has already affected our statistics.

Having seen the crawl report, what was the most surprising thing you found? (Were there errors you’d never thought you’d find?)
I was surprised to find so many old, unsupported pages sitting outside the website's structure. There were also a large number of 404 pages. We're really glad we managed to get a breakdown by website subdirectory; it helped us decide which team to start working with first.

You have worked with different crawlers. Can you compare JetOctopus with the others and assess it?
Every crawler looks for errors and finds them. The main point is the balance between the number of scanned pages and the price. JetOctopus is one of the most affordable crawlers.

Would you recommend JetOctopus to your friends?
We’re going to use it within our company from time to time. I would recommend the crawler to my friends if they were SEO optimizers.

Your suggestions for JetOctopus?
Refine the web version ASAP. There are a few things we were missing badly:

Thank you very much for such a detailed analysis. We are currently thinking over the redirects problem.