When auditing a website, it is important to understand how Googlebot sees it. Because of specific server settings, search engines sometimes receive a different version of a page than users do. In this article, we will show you how to view a website as Googlebot using JetOctopus.
How often should you perform this check? There is only one answer: every time. During each audit and each review, you should check what part of the code is visible to search engines.
1. If you have specific settings for users from different countries. For example, you have the websites https://example.com.de and https://example.com. Based on IP address, your web server redirects users from Germany to https://example.com.de, even when they request the international version (https://example.com). In such cases, you need to check which version the search robot sees and whether all domains, subdomains and subfolders are available to it.
2. During split testing or when testing new features. If you are testing new features (for example, a new tool for managing indexing rules), be sure to check what Googlebot sees and whether the corresponding HTML elements are displayed on the page when Googlebot scans the URL.
3. When preparing for migration. Using JetOctopus, you can check how Googlebot sees the new version of a page even on a staging website. Before, during and after any migration, it is important to perform this check on different types of pages.
4. When changing the website design, site architecture, etc.
5. During a technical audit and when looking for problems with indexing or crawling by search bots. If a page has not been indexed for a long time, has dropped out of the index, or has not been visited by search engines for a while, you should also check how it is rendered for Googlebot.
As you can see, such checks are necessary and should be carried out both on a regular basis and when investigating the causes of various issues.
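Before reaching for a crawler, you can run a quick spot check yourself by requesting the same URL with a regular browser user agent and with a Googlebot user agent, then comparing the responses. The sketch below uses only the Python standard library; the URL is illustrative, and the user-agent string follows the Googlebot Smartphone token that Google publishes, but treat both as placeholders for your own values.

```python
# Minimal sketch: fetch the same URL as a regular user and as Googlebot,
# so the two HTML responses can be compared (title, canonical, redirects).
import urllib.request

# Googlebot Smartphone user-agent token as published by Google (illustrative;
# check Google's documentation for the current string).
GOOGLEBOT_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Mobile "
    "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def fetch_as(url: str, user_agent: str) -> str:
    """Fetch `url` sending the given User-Agent header and return the body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Usage (network calls commented out; substitute your own URL):
# user_html = fetch_as("https://example.com", "Mozilla/5.0 (Windows NT 10.0)")
# bot_html  = fetch_as("https://example.com", GOOGLEBOT_UA)
# print(user_html == bot_html)
```

Note that this only detects differences triggered by the user-agent header; servers that verify the bot's IP (see below) will still treat such requests as a regular visitor.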
You have two options. The first is to scan the website using the Googlebot user agent. The second is to view any page from the crawl results using the URLs inspector. Below we will talk about each method separately.
You can crawl your website using the Googlebot Desktop or Googlebot Mobile user agent.
To do this, select the desired user agent when configuring the crawl.
The disadvantage of this method is that you will not have access to the raw code that Googlebot sees. In addition, if your web server uses a reverse DNS lookup to verify the authenticity of the bot, the crawl results may not match what Googlebot actually sees.
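The reverse DNS check mentioned above is the verification method Google itself recommends: resolve the client IP to a hostname, confirm the domain belongs to Google, then resolve the hostname back and confirm it matches the original IP. A minimal sketch of that logic, using the Python standard library:

```python
# Sketch of reverse-DNS verification of Googlebot: a crawler that merely
# spoofs the Googlebot user agent will fail this check, which is why such
# servers may serve it different content.
import socket

def is_real_googlebot(ip: str) -> bool:
    """Return True only if `ip` reverse-resolves to a Google hostname
    that forward-resolves back to the same IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip  # forward-confirm
    except OSError:
        return False

# A local address clearly fails the check:
# is_real_googlebot("127.0.0.1")  -> False
```

If your server applies a check like this, a third-party crawler using the Googlebot user agent will be identified as fake, so allow-listing the crawler's IPs (or an equivalent exception) is needed for the crawl to reflect what Googlebot sees.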
Go to crawl results – data tables – “Pages”. You can also check any URL from logs or Google Search Console: this tool is available in all sections.
Next, select the desired URL from the list and click on the icon with three lines.
In the drop-down window, select “View as Googlebot” or “View as Googlebot with JS”.
Next, the version of the page that Googlebot sees will open in a new browser tab. To inspect the HTML, use Chrome DevTools.
The advantage of this method is that you can explore what code Googlebot has rendered for a specific page.
Please note that the pages of your website may be visited by different types of Google bots using different versions of browsers. When you check how Googlebot sees your page, JetOctopus emulates the behaviour of the bot with the most common browser version.
Also, please note that if your website uses reverse DNS lookup or a security system such as Cloudflare, additional settings may be required on the server side or in Cloudflare.
More information: How to crawl websites using Cloudflare with the Googlebot user-agent.
Enjoy using it!