Screaming Frog vs. JetOctopus review & comparison


Aug 8, 2018

With SEO marketing developing actively all over the world, SEO experts keep searching for the most effective SEO tools.
Every webmaster sooner or later faces the need for a technical audit of their website, which is not an easy task in itself and becomes extremely difficult with big sites.
Crawling can be easy with crawlers such as Screaming Frog and JetOctopus. You don’t have to be a professional to cope with them; even a newbie can do it. Let’s see which of the two is more convenient.

The size of your site

How big you think your site is and how big it really is are two different things, but knowing at least the rough size should help. One client thought they had fewer than 1,000 pages. The default crawling limit in JetOctopus (JO) was set to 10,000, and the client was shocked to see that his site had more than 10,000 pages.

Screaming Frog – whether you can crawl your entire site depends on how powerful your computer is. SF is a desktop application and uses the resources of your machine; some people install SF on dedicated servers to crawl large sites. For sites of 1,000 pages or fewer SF will be sufficient, and even on a MacBook Air you can manage to crawl a 10,000-page site – but it takes a while and means you can’t use your PC in the meantime.

JetOctopus – is a cloud-based solution with extremely fast crawling (up to 200 pages per second), so it doesn’t matter what device you use. It is a true SaaS solution and doesn’t use your PC’s memory: start the crawl, close the tab and go back to work. When the crawl is finished you will receive an email.

*It should be mentioned that big websites need at least one crawl every 2-3 months.
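
As a rough sanity check on the figures above, crawl time is simply page count divided by crawl rate. The sketch below is illustrative only: 200 pages per second is the vendor’s quoted ceiling, and the 5 pages per second desktop figure is an assumption, not a benchmark.

```python
def estimated_crawl_time(pages: int, pages_per_second: float) -> str:
    """Rough crawl duration: pages / rate, formatted as hours, minutes, seconds."""
    total_seconds = int(pages / pages_per_second)
    hours, remainder = divmod(total_seconds, 3600)
    minutes, seconds = divmod(remainder, 60)
    return f"{hours}h {minutes}m {seconds}s"

# Illustrative rates only: 200 p/s is the quoted cloud ceiling,
# 5 p/s is an assumed figure for a desktop crawl on a laptop.
for site_size in (10_000, 100_000, 1_000_000):
    print(f"{site_size:>9} pages | cloud ~{estimated_crawl_time(site_size, 200)}"
          f" | desktop ~{estimated_crawl_time(site_size, 5)}")
```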

Scheduling of Crawls

You should schedule crawls to reduce admin and ensure you have regular data coming in. By scheduling crawls you’ll never be without fresh data.

Screaming Frog – A common misconception is that you can’t schedule crawls with Screaming Frog; it can be done, it’s just a lot more complicated to set up, and it helps if you have a dedicated server. If you have a server, you can schedule the tasks you want to run at specific times, following a long set of help instructions.
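
For readers who do want to script it, here is a minimal sketch. It assumes the Screaming Frog command-line mode is available and that the binary name and flags below match your installed version (check the official documentation); the script just launches a headless crawl and is meant to be triggered by cron or Windows Task Scheduler.

```python
import os
import subprocess
from datetime import date

# Assumptions (verify against your Screaming Frog version): the
# `screamingfrogseospider` binary is on PATH and supports headless
# command-line crawling with the flags used below.
SITE = "https://example.com"  # hypothetical site
OUTPUT_DIR = f"/var/crawls/sf-{date.today().isoformat()}"

os.makedirs(OUTPUT_DIR, exist_ok=True)
subprocess.run(
    [
        "screamingfrogseospider",
        "--crawl", SITE,
        "--headless",             # run without opening the GUI
        "--save-crawl",           # keep the .seospider crawl file
        "--output-folder", OUTPUT_DIR,
    ],
    check=True,
)
# Point a weekly cron entry or a Windows "Scheduled Task" at this script
# (e.g. every Sunday night) to approximate built-in crawl scheduling.
```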

JetOctopus – When setting up a project, you can choose how often you want the crawl to run and on what day. You can also change this at any point.

Most crawls can be set to run on a Sunday, so when you get into the office on Monday the crawl data is waiting for you to analyse. You can easily see what has changed, for better or worse, since your last crawl.


Winner – simply because of how easy it is to set up a schedule, JetOctopus has to be the better of the two here.

Completeness of analysis

Unless you crawl the entire website, you can’t assess its technical state correctly. The bigger the website, the harder it is to conduct a comprehensive audit. Partial analysis is useless and can even be harmful for your website, and here is why.

Screaming Frog – the crawler’s capacity is limited by the capacity of your PC. While it’s not a problem to analyze a small website with a few hundred pages, a comprehensive audit of a big e-commerce project will be a real pain.

JetOctopus – an unlimited SaaS solution that runs on its own servers. Cutting-edge technology allows it to crawl 10K pages in less than a minute. Instead of trying to combine data from partial crawls, you get a full-scale audit of every technical parameter.

Crawling a list of URLs

Sometimes you don’t want to crawl a whole website but just a list of URLs. There are many reasons why: checking the current status of links, seeing whether 404 errors have been resolved, and so on.

Screaming Frog – pretty easy to do: just change the Mode from “Spider” to “List” and paste in the list of URLs you want to crawl.

JetOctopus – there are 154 filters in the Datatable for working with crawled URLs and pages. Just set whichever parameters you need and get the whole picture.

This is a great feature for server log analysis. When analysing server logs you may be looking at several months’ worth of data, so it is handy to check the current status of pages that may have been fixed since the errors were logged.
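
If all you need is a quick spot check of a short URL list – for example, confirming that 404s found in the logs now resolve – a small script can do it before you reach for either crawler. This is a generic sketch using the `requests` library, not a feature of either tool; the `urls.txt` file name is an assumption.

```python
import requests

# Assumed input: a plain text file with one URL per line,
# e.g. exported from a log analysis or a previous crawl.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD keeps the check light; allow_redirects reports the final status.
        response = requests.head(url, allow_redirects=True, timeout=10)
        print(response.status_code, url)
    except requests.RequestException as exc:
        print("ERROR", url, exc)
```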

Number of different Sites

This will depend on your business: if you only own one website it isn’t an issue, though even working in-house you might have subsites. If you work for an agency or manage multiple affiliate sites, then the number of sites you can crawl is crucial.

Screaming Frog – Unlimited – the only drawbacks are the power of your machine and the fact that you can run only one crawl at a time. No simultaneous crawls are possible, so crawling all the sites you need takes quite a while, and you get only two keys per licence. At the end of the day, the overall time spent on crawling, analysing the crawl data and doing the technical optimization is rather long, and the increase in SEO traffic is still a long way off.

JetOctopus – Unlimited – the advantage is that you can crawl multiple different sites at once. There are no limits on simultaneous crawls and no project limits. Having bought one package, you can crawl as many websites as you need simultaneously, within your URL limit.

So, it is up to you to pick the winner here; it all depends on how many websites you crawl regularly. For agencies and owners of many sites, JetOctopus is the definite winner.

Finding Errors

After all, this is probably the main reason to use either of these tools – to find errors and fix them. Both find all the main issues such as 404s, 301s, etc.

Screaming Frog – is rather good at finding 404 pages, redirects, duplicate titles and broken links, but it takes some time to find and extract the errors from the crawl report. It is labour-intensive work, though how fast you go depends on your experience with SF; it is a trainable skill.

JetOctopus – can do all of that too, but unlike Screaming Frog, JetOctopus is aimed at deeper analysis and more sophisticated technical SEO. No other crawler collects as much data as JetOctopus does. There are currently 154 filters that let you adjust the analysis parameters to your particular case, and all this information is visualized so that you can benefit from it as quickly as possible. There is no need to spend tons of time extracting data; even an average SEO specialist can analyse the crawl data, which is extremely important for digital agencies with plenty of clients.

Winner – while both are very good, the JO dashboard allows you to see the errors quickly and get them fixed.
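
Whichever tool produces the crawl, the export usually ends up as a spreadsheet, and the basic error triage described above can be reproduced with a few filters. The sketch below uses pandas; the file name and the “Address”, “Status Code” and “Title 1” column names are assumptions modelled on a typical crawl export and should be adapted to your own report.

```python
import pandas as pd

# Assumed export format: a CSV with "Address", "Status Code" and "Title 1"
# columns - rename these to match whatever your crawler actually produces.
crawl = pd.read_csv("crawl_export.csv")
titles = crawl["Title 1"].astype("string")

not_found = crawl[crawl["Status Code"] == 404]
redirects = crawl[crawl["Status Code"].between(300, 399)]
empty_titles = crawl[titles.isna() | (titles.str.strip() == "")]
duplicate_titles = crawl[titles.notna() & titles.duplicated(keep=False)]

for name, frame in [("404s", not_found), ("redirects", redirects),
                    ("empty_titles", empty_titles),
                    ("duplicate_titles", duplicate_titles)]:
    print(f"{name}: {len(frame)} pages")
    frame[["Address", "Status Code"]].to_csv(f"{name}.csv", index=False)
```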

The unique and absolute advantages of JetOctopus

  • JO has an option of scheduling a regular crawl, which is really convenient. Imagine your site is actively developing and errors do occur from time to time, but you learn about them only from the Webmaster Console – which means a search engine has already indexed them and penalized you for them, and it takes ages to get those errors out of the index. It is much better when your website gets a scheduled crawl, for example every Sunday, without your participation and sends you an email with the results; if you see any negative changes you can quickly fix them without Google ever knowing there were any.
  • JO has a very handy option for working with big sites – the Segments section. This feature allows you to slice any site into as many business-oriented pieces as you wish and work with each one separately (a rough, tool-agnostic illustration is sketched after this list).
  • Datatable – 154 filters help you extract any problem from your huge crawl report: pages with empty titles, non-canonical pages, pages with slow load times, duplicated descriptions, etc.
  • Another absolute advantage of JO is that it is unlimited (no limits on projects, crawls, simultaneous crawls, segments or the log analyzer). Just put your websites on a regular crawl schedule, analyze, optimize, monitor all the parameters and forget about limits.
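
To make the Segments idea concrete, here is a rough, tool-agnostic sketch of slicing a crawled URL list into business-oriented buckets by path prefix. The segment rules and the `crawled_urls.txt` file name are invented for illustration and are not JO’s own configuration.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical segment rules: business-oriented label -> URL path prefix.
SEGMENTS = {
    "category pages": "/category/",
    "product pages": "/product/",
    "blog": "/blog/",
}

def segment_of(url: str) -> str:
    path = urlparse(url).path
    for label, prefix in SEGMENTS.items():
        if path.startswith(prefix):
            return label
    return "other"

buckets = defaultdict(list)
with open("crawled_urls.txt") as f:   # assumed: one crawled URL per line
    for line in f:
        url = line.strip()
        if url:
            buckets[segment_of(url)].append(url)

for label, urls in buckets.items():
    print(f"{label}: {len(urls)} URLs")  # analyse each slice separately
```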

Overall

JetOctopus is a new advanced crawling tool that can meet any of your requirements and needs, whether you are a skilled managing director running several websites with an SEO team or a newbie with your first startup website. It is up to you to decide how to promote your business and how to monitor your sites, but JetOctopus can help: smart, strategic, cost-effective SEO. There is even a Starter package, “No SF reality”, which costs the same as Screaming Frog and includes 100,000 URLs to crawl. This is the best option for trying out the full power of JO and comparing for yourself.

Get more useful info: What’s the difference between JetOctopus and other crawlers?

ABOUT THE AUTHOR

Julia Nesterts is a co-founder of the JetOctopus crawler. She is a Project Manager with 9+ years of experience in business strategy, web product development and optimization of core funnels. Julia is also an expert in product, social, email, word-of-mouth and content marketing. She successfully handles client support, team management and development.
