
What’s the difference between JetOctopus and other crawlers?


Jun 12, 2018


What’s the main difference from Screaming Frog? This is the question we hear most often from potential clients. To cut a long story short, read on for the differences. They are game-changing.

A SaaS crawler as fast and easy to use as a desktop one

Why do people like desktop crawlers? Because they are extremely easy to use.
Why do people hate SaaS? Because after you’ve made all the necessary settings and pressed “start crawling”, it feels like you’ve just launched a spaceship, no less.
It’s completely different with JetOctopus: two clicks and it starts crawling your site immediately. Not in an hour, not in half a day, but immediately. And you can watch the status update every second.

“Issues” and “Analytics” sections

Let’s imagine an SEO agency has just taken on a new customer, or you’ve just put a new site into production. At this point, information about the average length of non-canonical page titles is the least interesting thing.

What really matters now are the most critical issues: 99% of the site closed to indexing by mistake, half the site canonicalized to the index page, duplicate titles, and so on.

That’s why we made the “Issues” section, where you can find all the collected data on every existing problem. In just one click you can download the list of 5xx pages or of pages with duplicate HTML code.

When you need ideas for optimization or want to study the site more thoroughly, the “Analytics” section will be of much help. Over 202 charts in 26 reports provide all the required information about the site. It is impossible to create a checklist of issues that would fit every site, and for most SEO parameters it is impossible to tell whether the site is in the green or red zone. But an SEO expert working with the site can notice, for example, a change in the average number of unique words in titles at the 8th nesting level and check whether it points to a real issue. That’s what the “Analytics” section is made for.

Segmentation

Sometimes, when a site has many subsections, they are managed by different development teams, and you clearly need to handle issues and analytics within each section separately. That’s what “Segments” are for: you take a snapshot of the site with the required filters (over 120 of them). You don’t have to crawl the whole site again and wait hours for new results; you create a new segment and start working with it right away.
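To illustrate the idea behind segments: a segment is just a filtered view over data you have already crawled, which is why building one is cheap compared to a fresh crawl. This is a generic sketch, not JetOctopus’s actual implementation; the field names and filters are hypothetical.

```python
# Generic illustration of the "segment" idea: filter an existing crawl
# snapshot instead of re-crawling. All field names here are hypothetical.
crawl_snapshot = [
    {"url": "/catalog/item-1", "status": 200, "title_words": 7},
    {"url": "/blog/post-1",    "status": 404, "title_words": 0},
    {"url": "/catalog/item-2", "status": 200, "title_words": 12},
]

def segment(pages, *predicates):
    """Return the subset of already-crawled pages matching every predicate."""
    return [p for p in pages if all(pred(p) for pred in predicates)]

# A "healthy catalog pages" segment: no new crawl needed.
catalog_ok = segment(
    crawl_snapshot,
    lambda p: p["url"].startswith("/catalog/"),
    lambda p: p["status"] == 200,
)
print([p["url"] for p in catalog_ok])  # ['/catalog/item-1', '/catalog/item-2']
```

The point of the sketch: once the snapshot exists, a new segment is a cheap in-memory filter, not a multi-hour re-crawl.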

JetOctopus is a SaaS Crawler after all

While working with SEO agencies we often saw the following situation: there is one powerful computer with 16–32 GB of RAM and a desktop crawler installed on it, and office workers book the dates when they’d like to use the crawler in Google Calendar. After the crawl, they start the monkey job of bringing the information together across dozens of Excel tables.

To work with SaaS you don’t need a big Windows computer or fast Internet. Just think about it: when a desktop crawler shows you a “Page Loading Timeout Error”, is it a real error, or has your office cleaner unplugged the router by accident?

We work with servers on 1–10 Gbit/s channels all over the world. If a page doesn’t load due to a timeout, we try again after some time, and then once more, just like Googlebot does. And if it still doesn’t load in the end, you’ll see it in a separate report that you can hand over to your technical team.
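The retry behaviour described above can be sketched as follows. This is a minimal illustration under assumptions, not JetOctopus’s actual code: the `fetch` callable, attempt count, and delays are all hypothetical.

```python
import time

def fetch_with_retries(fetch, url, attempts=3, delay=0.0):
    """Retry a flaky fetch before flagging the page as a real failure.
    `fetch` is any callable returning a status code or raising TimeoutError."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return fetch(url)            # success: hand back the status code
        except TimeoutError as exc:
            last_error = exc
            time.sleep(delay * attempt)  # wait a bit longer before each retry
    # Still failing after all attempts: this URL belongs in the errors report.
    raise last_error

# A fetcher that times out twice, then succeeds -- like a briefly overloaded page.
calls = {"count": 0}
def flaky_fetch(url):
    calls["count"] += 1
    if calls["count"] < 3:
        raise TimeoutError(url)
    return 200

print(fetch_with_retries(flaky_fetch, "https://example.com/slow"))  # 200
```

A page that times out once is retried and never reaches the error report; only pages that fail every attempt get escalated.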

No Limits

We don’t like limits. That’s why we don’t have them for projects, crawls, domains, segments, or logs, and there will be none for integrations either: Google Analytics, Webmaster Tools, and so on.

We want our customers, whether big sites, SEO agencies, or SEO experts, to think about how to make their sites work better so that their own customers can earn more, not to agonize over which site to crawl now because there’s only one available project left.

Conclusion

It’s not difficult to make a crawler that gives you tons of information and hundreds of diagrams and reports. What is difficult is answering the question you will then ask yourself: “What am I to do with all this now?”

That’s why, while developing our product, we kept thinking of a problem-oriented crawler: the user shouldn’t spend hours in Excel trying to bring together all the collected information before handing it to a development team to fix. The monkey job should be done by computers, not people.

Try our 7-day free trial and feel the difference.

Get more useful info: Screaming Frog vs. JetOctopus review & comparison

ABOUT THE AUTHOR

Julia Nesterts is a co-founder of JetOctopus Crawler. She is a project manager with 9+ years of experience in business strategy, web product development, and optimization of core funnels. Julia is also an expert in product, social, email, word-of-mouth, and content marketing. She successfully handles client support, team management, and development.
