Let’s talk about SEO. But not only about it.
Why Website Speed Matters
Google has included site speed (and, as a result, page speed) as a signal in its search ranking algorithms. Speeding up websites is crucial, not just for website owners but for all Internet users.
Yes, robots.txt can be ignored by bots, and yes, it's not secure: anyone can see the contents of this file by visiting http://www.yourwebsite.com/robots.txt. Nevertheless, a well-considered robots.txt helps deliver your relevant content to search engine bots and keep low-priority pages out of the SERP. At first glance, writing directives is an easy task, but any kind of management requires your attention and forethought. Let’s go through the most common robots.txt mistakes and find out how to establish a constructive dialogue with Googlebot.
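Before editing the file on a live site, it can help to sanity-check how a well-behaved crawler would interpret your directives. A minimal sketch, using only Python's standard-library `urllib.robotparser`; the rules and URLs below are hypothetical examples, not directives for any real site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: Googlebot may crawl the blog but not checkout;
# all other bots are blocked entirely.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /checkout/
Allow: /blog/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own user-agent group, so the generic block does not apply.
print(parser.can_fetch("Googlebot", "https://www.yourwebsite.com/blog/post"))      # True
print(parser.can_fetch("Googlebot", "https://www.yourwebsite.com/checkout/cart"))  # False
# Any other bot falls back to the "*" group and is disallowed everywhere.
print(parser.can_fetch("SomeOtherBot", "https://www.yourwebsite.com/blog/post"))   # False
```

Running a check like this against each important URL pattern is a cheap way to catch the classic mistake of a directive accidentally blocking content you want indexed.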
At JetOctopus, we commonly work with small businesses that desperately want to expand. However, when they try to create content to match their ambitions, they find that scaling up is a massive problem. Moving from small-scale content production to managing thousands of product descriptions, internal linking strategies, and multi-channel marketing can seem like a daunting task. So what is the best course of action for companies that want to grow online? Creating top-tier content in large quantities isn't easy, and it can definitely be done badly. However, it can be done. Let's explore some proven strategies that ease the stress of scaling up content production.
Head of Digital Marketing at HeadHunter on the results of regular crawling & log analysis
Winter Spring is Coming!
Recently, Google Webmaster Central announced a series of SEO updates on Twitter under the hashtag #springiscoming. News about the new way Google will treat pagination greatly affected the SEO community. Let the dust settle, and let's get to the bottom of this news.
An SEO leader on the results of regular crawling & log analysis
Canonical tags were created to fix duplicate-content issues. In a nutshell, if you have three duplicate (or nearly identical) pages, you should pick just one of them to be shown in Google. Canonicalization is a way to help Googlebot decide which page to show in the SERP. However, rel=canonical tags don’t help the search engines unless you use them properly. This tutorial walks you through how to use JetOctopus to audit rel=canonical tags quickly and efficiently across a website.
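The core of any canonical audit is extracting the `<link rel="canonical">` tag from each crawled page and flagging pages where it is missing or duplicated. A minimal sketch of that extraction step, using only Python's standard-library `html.parser` (the sample page and URL are hypothetical; a crawler like JetOctopus performs this check at site scale):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag in a page."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))

# Hypothetical page HTML with a single, correctly placed canonical tag.
HTML = """
<html><head>
  <title>Red Shoes</title>
  <link rel="canonical" href="https://example.com/red-shoes">
</head><body>...</body></html>
"""

finder = CanonicalFinder()
finder.feed(HTML)
print(finder.canonicals)  # ['https://example.com/red-shoes']
```

In an audit, zero collected hrefs means the page has no canonical at all, while more than one means conflicting signals that Googlebot may resolve unpredictably.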
The new JetOctopus product update is out! You can now compare two different crawl reports and track how your crawls evolve. With our unique module, you can not only watch overall SEO dynamics but also compare changes across segments of your website. This approach helps you identify the strengths and weaknesses of each segment and replicate successful experiments across the whole website. Let’s focus on the real benefits you can get from our new module to maximize your ROI.
How to find and fix problems in the website's structure and optimize a sitemap both for Googlebot and for users
3M of possible crawl budget waste. A never-before-seen 3x mismatch!
Case study: templatemonster.com
Every webmaster sooner or later faces the necessity of a technical website audit, which is not an easy task in itself and is extremely difficult with big sites.
Crawling can be easy with crawlers such as Screaming Frog and JetOctopus. You don’t have to be a professional to use them; even a newbie can manage. Let’s see which is more convenient.
- Research and forecasting in IT
- Data Mining
- Big Data
- It took around 36 hours to check 252 million domains