June 9, 2022
by Ann
How to use segments
Using JetOctopus, you can customize as many segments as you need. We do not limit the number of segments. You can use URL segments when analyzing the results of crawling, logs...
June 8, 2022
by Sofia
Why the results of crawls differ
If you have crawled the same website twice or more, you may notice that the crawl results are different. In this article, we will explain why this happens and what to keep in...
June 7, 2022
by Sofia
How does the JetOctopus crawler work?
Understanding how the JetOctopus crawler works will help you better interpret your crawl results. The main principle of our crawler is very similar to search...
June 1, 2022
by Sofia
How to configure crawl for JavaScript websites
We recommend using advanced settings to crawl JavaScript websites. Thanks to these specific settings, you will be able to analyze your JS version in more detail. We've...
May 17, 2022
by Sofia
How to crawl websites using Cloudflare with the Googlebot user-agent
If you crawl your website with the regular JetOctopus user agent, it will not be blocked by Cloudflare. However, if you want to check how your site is seen by Googlebot using...
May 11, 2022
by Sofia
Product Update. GSC by Countries
For large multinational/multilingual websites, it can be difficult to track traffic dynamics in different countries. Monitoring the performance of all countries in total is an...
May 9, 2022
by Sofia
How to generate a sitemap with JetOctopus and how to submit it to Google
If you find errors in crawl results or logs, you can submit the corrected/updated pages to search engines for scanning. This can be done with a custom sitemap. JetOctopus allows...
May 6, 2022
by Sofia
What if you run a crawl and your website returns the 503 response code?
It happens: you run a crawl and the website returns a 503 response code. The “503 Service Unavailable” response code means that your web server cannot process the request at this...
May 2, 2022
by Sofia
How to configure a crawl of your website
Before you start a new crawl, we recommend preparing basic information about your website. This allows you to set up the optimal crawl configuration. Of course, the basic...
April 29, 2022
by Sofia
How to find URLs blocked by robots.txt
We want to have clean and accessible code on our website. We also want search engines to scan only the pages needed in search results, because we worry about our crawl budget...