Let’s talk about SEO. But not only about it.
Log files record every request the web server receives. Extracting SEO insights from them means heavy data processing, which calls for an enterprise-grade log analyzer. JetOctopus shows precisely how each bot crawls your pages, so you can optimize your site to increase organic visibility. Sounds great, right? Wait till we show you how to get the most out of a log analyzer, using a real website as an example.
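To make the idea concrete, here is a minimal Python sketch that parses a single access-log line in the common "combined" format and flags Googlebot hits by user agent. The sample line and domain are invented for illustration, and a real analyzer would also verify the bot via reverse DNS, since user agents can be spoofed:

```python
import re

# Combined Log Format: host ident authuser [date] "request" status bytes "referer" "user-agent"
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def parse_log_line(line):
    """Return a dict of fields from a combined-format log line, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def is_googlebot(entry):
    """Rough user-agent check; production tools also confirm the IP via reverse DNS."""
    return "Googlebot" in entry["agent"]

# Invented sample line for demonstration
sample = ('66.249.66.1 - - [01/Mar/2019:10:15:32 +0000] '
          '"GET /category/shoes HTTP/1.1" 200 5124 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

entry = parse_log_line(sample)
print(entry["path"], entry["status"], is_googlebot(entry))
```

Aggregating fields like `path`, `status`, and the bot flag over millions of such lines is exactly the heavy processing a log analyzer automates.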
Google has included site speed (and, as a result, page speed) as a signal in its search ranking algorithms. Speeding up websites is crucial, not just for website owners but for all Internet users. The JetOctopus team encourages you to start looking at the speed of your web pages, not only to improve your ranking in Google but also to improve user experience and increase profitability.
Yes, robots.txt can be ignored by bots, and yes, it isn't secure: anyone can see the contents of the file. Nevertheless, a well-considered robots.txt helps deliver your content to bots and keep low-priority pages out of the SERP. At first glance, writing directives is an easy task, but any kind of management requires your attention. Let's go through common robots.txt mistakes and ways to avoid them.
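One quick way to catch directive mistakes before they cost you traffic is to test specific URLs against your rules. This sketch uses Python's standard `urllib.robotparser`; the robots.txt content and the `example.com` URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for illustration: checkout and search pages are
# blocked site-wide, but one search page is opened for Googlebot.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /search

User-agent: Googlebot
Allow: /search/popular
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check how two bots would treat specific URLs under these rules.
print(rp.can_fetch("*", "https://example.com/checkout/"))
print(rp.can_fetch("Googlebot", "https://example.com/search/popular"))
print(rp.can_fetch("Googlebot", "https://example.com/search?q=shoes"))
```

Running your real robots.txt through a checker like this against a list of URLs you care about makes accidental blocks (or accidental exposure of low-priority pages) visible immediately.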
Recently, Google Webmaster Central announced a series of SEO updates on Twitter under the hashtag #springiscoming. The news about the new way Google will treat pagination affected the SEO community greatly. Let's let the dust settle and get to the bottom of this news.
Canonical tags were created to fix duplicate content issues. In a nutshell, if you have three duplicate pages (or nearly identical ones), you should pick just one of them to be shown in Google. Canonicalization is a way to help Googlebot decide which page to show in the SERP. However, rel=canonical tags don't help search engines unless you use them properly. This tutorial walks you through how to use JetOctopus to audit rel=canonical tags quickly and efficiently across a website.
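The core of such an audit is simply extracting every `<link rel="canonical">` from each page and flagging pages that declare zero or more than one. Here is a small sketch using Python's standard `html.parser`; the page markup and URL are invented:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            d = dict(attrs)
            if d.get("rel", "").lower() == "canonical" and "href" in d:
                self.canonicals.append(d["href"])

# Hypothetical page markup for demonstration
page = """<html><head>
<title>Shoes</title>
<link rel="canonical" href="https://example.com/shoes">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
# A page should declare exactly one canonical; zero or several is a red flag.
print(finder.canonicals)
```

At site scale a crawler like JetOctopus does this for every page and cross-checks that each canonical target actually exists, returns 200, and is not itself canonicalized elsewhere.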
JetOctopus's new product update is out! You can now compare two different crawl reports and see how your crawl evolves. With our unique module, you can not only watch overall SEO dynamics but also compare changes across segments of your website. This approach helps you identify the strengths and weaknesses of each segment and replicate successful experiments across the whole website. Let's focus on the real benefits you can get from our new module to maximize your ROI.
While most SEOs understand that submitting a sitemap to Google Search Console is important, they may not know the intricacies of implementing one in a way that drives SEO KPIs. Here is how to find and fix problems in a website's structure and optimize a sitemap for both Googlebot and users.
Crawling can be easy with crawlers such as Screaming Frog and JetOctopus. You don't have to be a professional to use them; even a newbie can manage. Let's see which is more convenient.