What You Should Know About rel=“next” and rel=“prev” Google Update

Mar 27, 2019


Spring is Coming!

Recently, Google Webmaster Central announced a series of SEO updates on Twitter under the hashtag #springiscoming. The news about the new way Google treats pagination stirred the SEO community. Let's let the dust settle and get to the bottom of it.

On March 21, 2019, @googlewmc posted the following tweet:

Vague information led to misunderstandings:

Google engineer Ilya Grigorik offered an explanation:

In other words, you don’t need to delete all the rel=prev/next links you’ve been implementing for so long. Note that these link attributes are part of a web standard, and that there are other search engine bots besides Googlebot. For instance, Bing’s Frédéric Dubut mentioned that the search engine uses rel=prev/next as a hint for crawling pages and understanding website structure, but not for grouping paginated pages or ranking them.

Since these link attributes are part of the W3C standard, not something Google invented, it’s best to keep everything as it is. In any case, it’s a good time to analyze your site structure comprehensively, as Google’s understanding of website hierarchy keeps improving.

Are there any changes in JetOctopus algorithms?

The JetOctopus crawler discovers paginated pages with the help of rel=prev/next link attributes:

For now, the JetOctopus tech team plans to keep everything working the way it does at the moment. Pagination rel=prev/next signals are still useful for other search engines — and if paginated pages are now just ‘normal pages’ to Google, it is even more important not to noindex them.
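As a rough illustration of how a crawler can pick up pagination from these attributes, here is a minimal Python sketch using only the standard library’s `html.parser`. The sample page and its URLs are hypothetical, not taken from any real site or from the JetOctopus codebase:

```python
from html.parser import HTMLParser

# Hypothetical paginated category page carrying the
# rel="prev"/rel="next" <link> markup discussed above.
SAMPLE_PAGE = """
<html><head>
  <link rel="prev" href="https://example.com/category?page=1">
  <link rel="next" href="https://example.com/category?page=3">
</head><body>...</body></html>
"""

class PaginationLinkParser(HTMLParser):
    """Collects rel="prev"/rel="next" <link> targets from one page."""

    def __init__(self):
        super().__init__()
        self.pagination = {}  # e.g. {"prev": url, "next": url}

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if rel in ("prev", "next") and "href" in attrs:
            self.pagination[rel] = attrs["href"]

parser = PaginationLinkParser()
parser.feed(SAMPLE_PAGE)
print(parser.pagination)
```

A crawler would feed each fetched page through such a parser and follow the discovered `next` URL to walk the whole paginated series.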

Get more useful info: How to Create a Site That Will Never Go Down


Ann Yaroshenko is a Content Marketing Strategist at JetOctopus. She holds Master’s degrees in publishing and editing and in philology, and has two years of experience in Human Resources Management. Ann has been part of the JetOctopus team since 2018.

Auto classified, 20m pages crawled
What problem was your SEO department working on when you decided to try our crawler?
We needed to detect all possible errors quickly, because Google Search Console shows only 1,000 results per day.

What problems did you find?
That’s quite a broad question. We managed to detect old, unsupported pages and errors related to them. We also found a large number of duplicated pages and pages with 404 response code.

How quickly did you implement the required changes after the crawl?
We are still implementing them, because the website is large and there are lots of errors on it. Four teams are currently working on the website, so we have to assign each particular error to a particular team and draw up individual statements of work.

And what were the results?
It’s quite difficult to measure results right now, because we constantly work on the website and make changes. But a higher scan frequency by bots would mean the changes are productive. Around one and a half months ago we enabled indexing of all the paginated pages, and this has already affected our statistics.

Having seen the crawl report, what was the most surprising thing you found? (Were there errors you’d never thought you’d find?)
I was surprised to find so many old, unsupported pages outside the website’s structure. There was also a large number of 404 pages. We are really glad we managed to get a breakdown of the website’s subdirectories — that helped us decide which team to start working with first.

You have worked with different crawlers. Can you compare JetOctopus with the others and assess it?
Every crawler looks for errors and finds them. The main point is the balance between the number of scanned pages and the price. JetOctopus is one of the most affordable crawlers.

Would you recommend JetOctopus to your friends?
We’re going to use it within our company from time to time. I would recommend the crawler to my friends if they were SEO specialists.

Your suggestions for JetOctopus.
Refine the web version ASAP — there are a few things we were badly missing.
Thank you very much for such a detailed analysis. We are currently reflecting on a redirect problem.