
How to migrate a site without SEO drop. Case study


Nov 11, 2019


Changing a site’s CMS rarely goes smoothly. Issues such as broken links, error HTTP responses and duplicate pages often surface and cause SEO drops that last for months. The good news is that it’s still possible to move your site with minimal traffic loss. In this case study, we share the well-tested migration strategy used on our client’s website. Take notes and memorize it word for word, or don’t. Either way, you will find actionable tips on how to minimize risks during a site migration.


About the challenges:

Our client’s site is an e-commerce platform with 10 years of history and 300K indexed pages. Over the decade, teams of developers changed the site’s structure numerous times and thus multiplied orphaned pages, redirects and ‘Page not found’ 404 errors. The site was like a Pandora’s box, so to speak.

So as not to migrate all these bugs to the new version, SEOs needed to find the technical issues and prioritize them. I asked SEO experts via social media which technical parameters deserve special attention before a migration and included the useful pieces of advice in this case study. In the post-launch phase, it was crucial to see how bots perceived the site on the new CMS in order to eliminate any technical issues before losing the site’s credibility in Google’s eyes.

Given the complexity of the challenge, the team of SEOs together with JetOctopus tech geek Serge Bezborodov developed a migration plan. Fortunately, it worked pretty well! If you’re planning to move a site, check out this case to benefit from their positive experience.

A site migration win strategy

Before the release

1. Clean up the mess in the structure

Before moving content to a new CMS, it’s essential to get rid of useless data. To that end, SEOs crawled the site to see the full structure and integrated historical log files to understand whether bots perceive URLs as intended. The reality was shocking: there were 2.5M orphaned URLs:

Source: report in crawler JetOctopus
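A minimal sketch of such a cross-check: compare the set of URLs reachable through internal links (from a crawl export) with the set of URLs that search bots actually request (from access logs). The file names and the one-URL-per-line format are illustrative assumptions, not a specific JetOctopus export:

```python
# Cross-reference crawl data with server logs to find orphaned URLs:
# pages that bots request (seen in logs) but that are not reachable
# through internal links (not seen in the crawl).

def load_urls(path):
    """Read one URL per line, ignoring blank lines."""
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

crawled = load_urls("crawl_urls.txt")  # URLs reachable via internal links
logged = load_urls("log_urls.txt")     # URLs requested by search bots

orphaned = logged - crawled            # visited by bots but not linked internally
never_visited = crawled - logged       # linked internally but ignored by bots

print(f"Orphaned URLs: {len(orphaned)}")
print(f"Linked but never visited by bots: {len(never_visited)}")
```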

What to do

Explore the content on orphaned pages and separate useful URLs from non-profitable ones. Then return the useful URLs to the structure by inserting internal links from relevant indexable pages. Don’t forget to get rid of useless content by closing it with noindex tags or robots.txt (depending on the volume of similar URLs). To avoid mistakes in directives, check out our guide on common robots.txt pitfalls.
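If you close large groups of useless URLs with robots.txt, it’s worth testing the rules before deploying them. Here is a minimal sketch using Python’s standard urllib.robotparser; the Disallow paths and test URLs are purely illustrative:

```python
from urllib.robotparser import RobotFileParser

# Proposed robots.txt rules for closing useless URL groups (hypothetical paths).
rules = """
User-agent: *
Disallow: /search/
Disallow: /old-catalog/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# URLs that should be blocked vs. URLs that must stay crawlable.
expectations = {
    "https://example.com/search/?q=shoes": False,     # should be blocked
    "https://example.com/old-catalog/item-1": False,  # should be blocked
    "https://example.com/category/shoes/": True,      # must stay allowed
}

for url, should_be_allowed in expectations.items():
    allowed = parser.can_fetch("*", url)
    status = "OK" if allowed == should_be_allowed else "MISMATCH"
    print(f"{status}: {url} (allowed={allowed})")
```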

When developers followed these recommendations, the number of orphaned pages significantly decreased:

Source: report in log analyzer JetOctopus

The moral here is that you may think you know your website well, but in reality, bots could be spending their resources on URLs that you don’t even know about.

2. Optimize internal links

Before launching a new version of a site, it’s crucial to check whether pages are linked properly. The thing is, when developers open access to a website, bots start crawling from the given page and discover new pages through links. So as not to go into details here, I’ll link to a detailed explanation of the crawling and indexation process from Google’s blog.

Based on our yearlong research of 300M crawled pages, the more internal links point to a page, the more often that page is visited and crawled by bots. 1-10 links to a page of a big site aren’t enough to look authoritative in Google’s eyes.

On our client’s website, there were 55K pages with only 1 internal link and 50K pages with fewer than 10 links.
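A rough sketch of how such poorly linked pages can be surfaced from crawl data, assuming you can export the internal link graph as a two-column CSV of (source URL, target URL) pairs; the file name and threshold are illustrative:

```python
import csv
from collections import Counter

# Count inbound internal links per page from a link-graph export
# and flag pages below a chosen threshold.
THRESHOLD = 10

inlinks = Counter()
with open("internal_links.csv", newline="", encoding="utf-8") as f:
    for source, target in csv.reader(f):
        inlinks[target] += 1

# Note: pages with zero inbound links won't appear here at all;
# combine this with the orphaned-pages check above to catch them.
poorly_linked = sorted(
    (url for url, count in inlinks.items() if count < THRESHOLD),
    key=lambda url: inlinks[url],
)

print(f"{len(poorly_linked)} pages have fewer than {THRESHOLD} inbound links")
for url in poorly_linked[:20]:
    print(inlinks[url], url)
```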

What NOT to do

Don’t insert an extreme number of random links across your website.

What to do

Go back to your site structure plan and think about which sections are the most relevant places to insert links from, so that the structure stays coherent. In case you don’t know how to benefit from a visualized site structure, here is a good article about how to organize your website in a way that clearly communicates topic relevance to search engines.

When the SEOs of the analyzed site prioritized the list of URLs with a poor interlinking structure and added relevant links to those pages, the number of bot visits increased. There are still some poorly linked URLs left, but these pages aren’t as important as the optimized ones and can wait until the high-priority SEO tasks are done.

3. Keep calm and love redirects

Correctly redirecting your pages is one of the most important things you can do to make a site migration go smoothly. Broken redirects and redirect chains are widespread issues on big e-commerce sites.

Source: report in crawler JetOctopus

Developers reconsidered the site structure to make it more user-friendly and then redirected relevant pages from the old version with permanent 301 status codes.
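Before and after the switch, the redirect map can be verified automatically: request every old URL and confirm that it reaches the planned new URL in a single 301 hop. A sketch using the requests library; redirect_map.csv is a hypothetical two-column export of old and new URLs:

```python
import csv
import requests

# Verify that each old URL reaches its planned new URL
# in a single permanent (301) hop: no chains, no errors.
with open("redirect_map.csv", newline="", encoding="utf-8") as f:
    for old_url, expected_url in csv.reader(f):
        resp = requests.get(old_url, allow_redirects=True, timeout=10)
        hops = resp.history  # intermediate redirect responses

        if not hops:
            print(f"NO REDIRECT: {old_url} returned {resp.status_code}")
        elif len(hops) > 1:
            print(f"CHAIN ({len(hops)} hops): {old_url} -> {resp.url}")
        elif hops[0].status_code != 301:
            print(f"WRONG CODE {hops[0].status_code}: {old_url}")
        elif resp.url.rstrip("/") != expected_url.rstrip("/"):
            print(f"WRONG TARGET: {old_url} -> {resp.url}, expected {expected_url}")
        elif resp.status_code != 200:
            print(f"BROKEN TARGET: {old_url} -> {resp.url} returned {resp.status_code}")
```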

After the release

Yep, so far we’d done a great deal of work, but it was too soon for a victory celebration. Some bugs always appear unexpectedly after a site migration. The earlier you find technical issues, the better your chances of avoiding an SEO drop.

1. Check whether everything is OK. Now come back and check again

Remember that developers aren’t robots but humans who make mistakes from time to time.

For instance, developers of our client’s site forgot to delete the noindex meta tag from a huge part of the site:

Fortunately, SEOs quickly spotted this blunder in the live stream of bot visits in the logs. The moral here is that a comprehensive audit of your server log files is crucial after launching the site.
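A simple safety net for this kind of blunder is to fetch a sample of important URLs right after the release and check both the robots meta tag and the X-Robots-Tag header. A minimal sketch with requests; the sample URLs and the regex-based check are illustrative assumptions:

```python
import re
import requests

# Flag pages that are accidentally closed from indexing after the release,
# either via a robots meta tag or via the X-Robots-Tag HTTP header.
NOINDEX_META = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex', re.I
)

urls_to_check = [
    "https://example.com/",            # illustrative sample of key pages
    "https://example.com/category/",
]

for url in urls_to_check:
    resp = requests.get(url, timeout=10)
    in_header = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    in_meta = bool(NOINDEX_META.search(resp.text))
    if in_header or in_meta:
        print(f"NOINDEX FOUND: {url} (header={in_header}, meta={in_meta})")
```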

2. Checked twice? Do it a third time

No, I don’t want to sound like a broken record. I’m just saying that a site on a new CMS often works differently, and that a few months can pass before search bots crawl all the pages of a big website. During that time you should keep your eyes wide open to make sure all URLs are being indexed correctly.

Our client’s site faced an issue with ‘blinking’ status codes. That is when, for instance, a bot crawls a page and gets a 200 OK code, then recrawls it and gets a 500 error.

Source: datatable in crawler JetOctopus

If there are a bunch of blinking HTTP responses, bots will treat your website as unstable and demote it in the SERP. SEOs shared a list of problematic URLs with developers, who in turn eliminated the cause of the issue.
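Blinking status codes are easy to spot directly in the access logs: group bot requests by URL and flag URLs that returned more than one distinct status code. A sketch assuming a common combined log format; the log path and the Googlebot filter are illustrative:

```python
import re
from collections import defaultdict

# Find URLs that return different status codes on different bot visits.
REQUEST = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

codes_by_url = defaultdict(set)
with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        if "Googlebot" not in line:  # keep only search-bot hits
            continue
        match = REQUEST.search(line)
        if match:
            codes_by_url[match.group("path")].add(match.group("status"))

blinking = {url: codes for url, codes in codes_by_url.items() if len(codes) > 1}
for url, codes in sorted(blinking.items()):
    print(url, sorted(codes))
```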

Results

In this part, I won’t make a speech; I’ll just show the statistics of total organic visits over the last six months. The migration took place on July 1, 2019:

Source: analytics data from SimilarWeb

The number of visits that bots made to the site, as well as the number of crawled pages, has increased since July:

Source: data from server logs analysis

Bonus:

Do you agree that expert SEO advice is worth its weight in gold? That’s why I asked SEO influencers to name the technical parameters that are crucial at the pre-launch stage. Here are the most useful insights:

  • URL structure (as I would want the ability to leave the old structure)
  • Ability to add/change attributes in the head on a page level (canonicals, meta-robots)
  • The overall performance (e.g. in terms of speed)
  • Kristina Azarenko - SEO expert with 10 years of hands-on experience. Author of marketingsyrup.com

    • Content migration
    • URL & Content consistency (e.g. on-page, titles, meta)
    • Redirects (most important if content changing)
    • Site speed
    • Robots.txt & Noindex
    • Technical carryover or upgrades (if you have enhancements in place you want them to stay in place)

    Jacob Stoops - Sr. SEO Manager at SearchDiscovery. Over a decade in SEO. Host of Page2Podcast

    • Clean up navigation, URL structure, hierarchy
    • Export all URLs and matching keywords (also rankings)
    • Redirect plan
    • Change internal links (relative paths)
    • Validate that each URL has a self-referring canonical or a designated canonical (see the sketch after this list)
    • Check that metadata and pagination are correct for new pages and still correct for migrated pages
    • Update the robots.txt file
    • Change the XML sitemap
    • Check feeds to third parties (e.g. Channable)
    • Validate the SSL certificate
    • Remove all unnecessary / old scripts
    • Keep the page source as clean as possible (in terms of performance)
    • Change / implement trackers such as Analytics, GTM
    • Change (if relevant) structured data to the new domain / setup

    Marc van Herrikhuijzen. SEO strategist at BCC Elektro-speciaalzaken BV. Marc also kindly gave his advice about post-migration optimization. You can read his post in the Technical SEO group on LinkedIn.
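As a small illustration of the canonical check from the list above, here is a minimal sketch that fetches a sample of URLs and compares each page’s canonical tag to its own address. The sample URLs are hypothetical, and the regex-based parsing keeps the example dependency-free:

```python
import re
import requests

# Check that each sampled URL declares itself (or a deliberate target) as canonical.
CANONICAL = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I
)

sample_urls = ["https://example.com/category/shoes/"]  # illustrative sample

for url in sample_urls:
    resp = requests.get(url, timeout=10)
    match = CANONICAL.search(resp.text)
    if not match:
        print(f"MISSING CANONICAL: {url}")
    elif match.group(1).rstrip("/") != url.rstrip("/"):
        print(f"CANONICAL POINTS ELSEWHERE: {url} -> {match.group(1)}")
```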

    I'll add: make sure the CMS doesn't create fragments or nodes (or if it does, that you account for that). Common in Drupal, custom post types in WordPress, etc.

    Jenny Halasz. SEO & Analytics Expert; President JLH Marketing. Professional Speaker

    I'd probably add checking media behaviour (WP might create attachment pages for example), reviewing robots (some like Shopify are locked-down), XML sitemap behaviour

    Matt Tutt. Holistic digital marketing consultant specialising in SEO, from Bournemouth in the UK

    Don't forget to renew old domains that have been migrated. These old domains often have lots of links pointing to them, and losing those is likely to cause HUGE ranking drops.

    Steven van Vessum. VP of Community at Content King, contributor to SEJ, Content Marketing Institute, and CMSwire

To sum up

Site migration is a complex process. That is when a data-driven SEO approach is a MUST. To make this task a piece of cake, divide it into two stages: BEFORE and AFTER the release.

At the pre-launch stage, conduct a comprehensive technical audit to find bugs that could harm your SEO on the new version of the site. Pay special attention to:

  • website structure,
  • interlinking optimization,
  • redirect implementation.

At the post-launch stage, verify that all goes according to plan by monitoring Google’s reaction and end-user visit trends. That will help you detect technical issues and eliminate them as soon as possible.

As you can see, migrating a big site won’t be a nightmare if you approach the task in a comprehensive way. This is when an analytics tool is absolutely necessary. I won’t name the best tool for a technical SEO audit - I’ll just leave a link to a 7-day free trial.

About the author:

Ann Yaroshenko is a Content Marketing Strategist at JetOctopus crawler. She has a Master’s diploma in publishing and editing and a Master’s diploma in philology. Ann has two years of experience in Human Resources Management and has been part of the JetOctopus team since 2018.

Wanna learn more? Here is how to optimize hreflangs and thus increase indexability. Case study
