SEO Funnel (January 2024)
A consolidating report – the best start of 2024 for all our clients. It helps you understand where the main problem is.
📎 The SEO funnel reflects the state of a website across the main stages of organic traffic growth (a page is created on the site → it is visited by Googlebot → it gets ranked → it starts bringing organic traffic).
📎 The SEO funnel shows you the most critical work to start with to achieve positive SEO dynamics.
📎 Now you will know which pages are “dead” in terms of SEO and what to change so they get ranked.
It all saves you a loooooot of time. And most importantly – you will spend SEO resources on impactful optimizations that actually bring new organic traffic, not SEO work for the sake of SEO work.
How-to video 👉 https://lnkd.in/ed3ZNP3R
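To make the funnel concrete, here is a minimal sketch – not JetOctopus code, just an illustration with hypothetical CSV exports of crawl, log and GSC data and made-up column names – of how each page could be assigned to the deepest funnel stage it has reached:

```python
import pandas as pd

# Hypothetical exports: crawled pages, Googlebot log hits, and GSC performance data.
crawl = pd.read_csv("crawl_pages.csv")     # column: url
logs = pd.read_csv("googlebot_hits.csv")   # column: url
gsc = pd.read_csv("gsc_performance.csv")   # columns: url, impressions, clicks

visited = set(logs["url"])
ranked = set(gsc.loc[gsc["impressions"] > 0, "url"])
traffic = set(gsc.loc[gsc["clicks"] > 0, "url"])

def funnel_stage(url: str) -> str:
    """Assign a page to the deepest funnel stage it has reached."""
    if url in traffic:
        return "brings organic traffic"
    if url in ranked:
        return "ranked (impressions, no clicks)"
    if url in visited:
        return "visited by Googlebot, not ranked"
    return "created, not yet visited by Googlebot"

crawl["stage"] = crawl["url"].map(funnel_stage)
print(crawl["stage"].value_counts())
```

Counting pages per stage shows where the funnel leaks: many pages stuck at “not yet visited by Googlebot” point to crawlability issues, while many pages stuck at “ranked, no clicks” point to relevance or snippet problems.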
Impressions (January 2024)
A big inspiration to the SEO community!
With this tool, users can analyze on-page SEO efficiency with Impressions as the main criterion of analysis.
Another inspirational tool that brings pure satisfaction, not frustration. SEO data can be visual and informative!
📍 Pages in the site structure that get impressions
📍 Pages NOT in the site structure that get impressions
📍 Impressions by page depth
📍 Impressions by content size
📍 Impressions by number of internal links
📍 Impressions by load time
An additional dimension for analysis: content length, load time, efficiency by country, page depth, etc.
You can play with any SEO metrics endlessly, getting valuable insights on the fly 🌪️
Look at on-page SEO metrics and how they impact:
– Impressions
– Googlebot visits
– Rankings
– Clicks
Is interlinking important? Does the number of words on a page have any impact? What about load time?
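As a rough illustration of this kind of analysis – a hypothetical joined export with made-up column names, not the JetOctopus implementation – you could bucket pages by an on-page metric and compare impressions per bucket:

```python
import pandas as pd

# Hypothetical dataset: one row per URL with crawl metrics and GSC impressions already joined.
pages = pd.read_csv("pages_with_gsc.csv")  # columns: url, depth, word_count, load_time_ms, impressions

# Impressions by page depth.
by_depth = pages.groupby("depth")["impressions"].agg(["count", "sum", "mean"])

# Impressions by content size, binned into word-count buckets.
bins = [0, 300, 600, 1200, 2400, float("inf")]
labels = ["<300", "300-600", "600-1200", "1200-2400", "2400+"]
pages["size_bucket"] = pd.cut(pages["word_count"], bins=bins, labels=labels)
by_size = pages.groupby("size_bucket", observed=True)["impressions"].agg(["count", "sum", "mean"])

print(by_depth)
print(by_size)
```

The same pattern works for load time or the number of internal links – swap the column and the bins.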
You know that JetOctopus.com was initially built for the SEO needs of big websites, and we constantly add more tools to increase the efficiency of e-commerce websites.
We are proud to present outstanding new tools for analyzing the SEO efficiency of products – e-commerce insights that can dramatically increase SEO traffic.
Get preset, detailed information about products by category:
– indexation management of products,
– products crawled by Googlebot vs. NOT crawled,
– products ranked in the SERP vs. not ranked,
– products ranked in the TOP 10 and the TOP 3,
– the average position of ranked products,
– products bringing organic traffic.
Usually, you don’t have time to gather such detailed information, but if you work with e-commerce, you should work with this data regularly, trying to rank as many product pages as possible since they generate sales. Now it is all preset for you!
How-to video here 👉 https://www.youtube.com/watch?v=kMSLwGMC4Rc
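Here is a rough sketch of how such a per-category breakdown could be computed – a hypothetical product-level export with made-up column names; the actual preset lives inside JetOctopus:

```python
import pandas as pd

# Hypothetical product-level dataset joining crawl, log, and GSC data.
products = pd.read_csv("products.csv")
# columns: url, category, is_indexable, googlebot_hits, impressions, clicks, avg_position

summary = products.groupby("category").agg(
    products=("url", "count"),
    indexable=("is_indexable", "sum"),
    crawled=("googlebot_hits", lambda s: (s > 0).sum()),
    ranked=("impressions", lambda s: (s > 0).sum()),
    top10=("avg_position", lambda s: (s <= 10).sum()),
    top3=("avg_position", lambda s: (s <= 3).sum()),
    avg_position=("avg_position", "mean"),
    with_traffic=("clicks", lambda s: (s > 0).sum()),
)
print(summary.sort_values("products", ascending=False))
```

Categories with many products but few of them in the top 10 are the first candidates for internal linking and content work.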
Media websites, publishers, and anyone who cares about quick article indexation and a long article lifecycle to make news content work effectively – grab these tools.
Inside you will work with:
– Crawl gap: how quickly Googlebot visits an article after it is published
– Lifecycle of an article (impressions, clicks)
– SEO efficiency by category (Googlebot visits, impressions, clicks, positions, CTR)
– SEO efficiency by article (the same metrics)
– Efficiency by author from an SEO point of view!
🏆 Outstanding, isn’t it?
Watch How-to video https://www.youtube.com/watch?v=wNBrlRXn6M0
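As an illustration, the crawl gap can be approximated from your own data – a sketch with hypothetical CSV exports of publish times and raw Googlebot hits, not the JetOctopus report itself:

```python
import pandas as pd

# Hypothetical inputs: article publish times and Googlebot hits from server logs.
articles = pd.read_csv("articles.csv", parse_dates=["published_at"])  # columns: url, published_at
hits = pd.read_csv("googlebot_hits.csv", parse_dates=["hit_at"])      # columns: url, hit_at

# First Googlebot visit per article after publication.
first_hit = (
    hits.merge(articles, on="url")
        .query("hit_at >= published_at")
        .groupby("url", as_index=False)["hit_at"].min()
)

gap = articles.merge(first_hit, on="url", how="left")
gap["crawl_gap_hours"] = (gap["hit_at"] - gap["published_at"]).dt.total_seconds() / 3600

print(gap["crawl_gap_hours"].describe())                       # distribution of the crawl gap
print(gap.loc[gap["hit_at"].isna(), ["url", "published_at"]])  # articles not visited yet
```

For news content, a long crawl gap usually means the article misses its traffic window entirely, so this is the first metric worth watching.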
The GSC Subfolders Report provides an in-depth analysis of how Google Search Console (GSC) anonymizes data for your website. You can identify which subfolders generate significant traffic and consider adding them as separate properties in GSC. This approach gives you access to more detailed data and minimizes query anonymization, making it easier to monitor and optimize your site’s performance.
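The underlying idea is simple enough to sketch outside the tool – here with a hypothetical page-level export from the GSC Performance report and made-up column names:

```python
from urllib.parse import urlparse

import pandas as pd

# Hypothetical page-level export from the GSC Performance report.
gsc = pd.read_csv("gsc_pages.csv")  # columns: page, clicks, impressions

def first_subfolder(url: str) -> str:
    """Return the first path segment, e.g. https://site.com/blog/post -> /blog/."""
    parts = [p for p in urlparse(url).path.split("/") if p]
    return f"/{parts[0]}/" if parts else "/"

gsc["subfolder"] = gsc["page"].map(first_subfolder)
by_subfolder = (
    gsc.groupby("subfolder")[["clicks", "impressions"]]
       .sum()
       .sort_values("clicks", ascending=False)
)
# Subfolders carrying a large share of clicks are candidates for separate GSC properties.
print(by_subfolder.head(10))
```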
We’ve introduced new crawl options, including more precise controls for crawl speed and timeout settings, advanced JavaScript rendering configurations, and additional customization features to better tailor crawls to your specific website needs.
When you work with big websites, the first thing you need is the ability to slice the data into logical parts and then work with each segment/directory/category separately: analyzing efficiency, running experiments, monitoring the dynamics.
You already know our Site Structure Efficiency tool, which has become a favourite among our users. Now we’ve upgraded it: Site Structure Efficiency data is now divided into logical parts:
– By directories
– By segments
– By categories
The how-to video will reveal the full power of the tool.
We’ve enhanced account security by introducing the option to use Google Authenticator for an extra layer of protection.
Structured Data Extraction in URL Explorer – Dive deep into URL-level investigations with structured data extraction now integrated directly into the URL Explorer for more detailed insights.
One of the most anticipated features in JetOctopus! You can now leverage your existing XPath rules from other tools for custom extraction, streamlining your data collection processes.
Unlock limitless data extraction capabilities with the option to write custom JavaScript code executed against the fully rendered page in a browser. This is an invaluable feature when traditional methods like XPath or CSS rules fall short.
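To illustrate the concept – this is not JetOctopus internals, just a minimal sketch of the same idea using Playwright, with a hypothetical URL and selectors – custom JavaScript is evaluated against the fully rendered DOM and returns whatever structure you need:

```python
from playwright.sync_api import sync_playwright

# Custom extraction logic written in JavaScript, executed in the browser context
# once the page has been rendered. The selectors here are purely illustrative.
EXTRACTION_JS = """
() => ({
    title: document.title,
    h1: document.querySelector("h1")?.textContent?.trim() ?? null,
    price: document.querySelector("[itemprop='price']")?.getAttribute("content") ?? null,
    jsonLd: [...document.querySelectorAll("script[type='application/ld+json']")]
        .map(s => s.textContent),
})
"""

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://example.com/some-product", wait_until="networkidle")
    data = page.evaluate(EXTRACTION_JS)  # runs against the fully rendered page
    browser.close()

print(data)
```

This is exactly the kind of case where XPath or CSS rules fall short: the returned object can combine computed values, multiple selectors, and parsed JSON-LD in one pass.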
We are glad that this long-awaited item on our roadmap is finally live – the Manager’s Dashboard. There are a ton of reports with SEO-specific insights and opportunities, and SEOs love working with all of them. But we all know that managers are also clients of the tool.
Now the big picture of SEO is available for managers to analyze in the SEO Trends section:
🔖 Traffic trends: total vs SEO
🔖 SERP efficiency details in dynamics
🔖 Crawl Budget dynamics
Watch the how-to video and come test it 👉
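The first of these trends is easy to picture: the share of organic sessions in total traffic over time. A minimal sketch with a hypothetical analytics export and made-up column names:

```python
import pandas as pd

# Hypothetical analytics export: sessions per date and channel.
sessions = pd.read_csv("sessions.csv", parse_dates=["date"])  # columns: date, channel, sessions

weekly = (
    sessions.assign(is_seo=sessions["channel"].eq("organic"))
            .groupby([pd.Grouper(key="date", freq="W"), "is_seo"])["sessions"].sum()
            .unstack(fill_value=0)
)
weekly["total"] = weekly.sum(axis=1)
weekly["seo_share"] = weekly.get(True, 0) / weekly["total"]

print(weekly[["total", "seo_share"]].tail(12))  # last 12 weeks: total traffic vs SEO share
```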
One of the most significant updates is robots.txt tracking, which notifies you of sudden changes to your robots.txt file—helping prevent potential traffic disruptions caused by misconfigurations. Our alert system has expanded to help users monitor critical changes in Google News and Google Discover.
The DataTable is at the heart of JetOctopus, where users spend the majority of their time. We’ve made substantial improvements to ensure it operates faster and more efficiently, delivering a seamless user experience.
Robots.txt file monitoring. Always fresh, always at hand. As always :)
Welcome a new section in Crawler – Robots.txt:
📍 all robots.txt files with status code, file size, and load time.
And new alerts on robots.txt:
📍 if the content of your robots.txt has changed
📍 if the status code of robots.txt has changed
📍 if the status code of the robots.txt file is not 200
More details in the video
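The core of such monitoring is simple and worth understanding; here is a minimal standalone sketch (not how JetOctopus implements it) that checks the status code and detects content changes between runs, with a placeholder URL:

```python
import hashlib
import urllib.error
import urllib.request

ROBOTS_URL = "https://example.com/robots.txt"  # your site (placeholder URL)
STATE_FILE = "robots_state.txt"                # hash of the previously seen content

def fetch_robots(url: str) -> tuple[int, bytes]:
    """Return (status_code, body) for robots.txt, even on non-200 responses."""
    req = urllib.request.Request(url, headers={"User-Agent": "robots-monitor/1.0"})
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            return resp.status, resp.read()
    except urllib.error.HTTPError as e:
        return e.code, e.read()

status, body = fetch_robots(ROBOTS_URL)
digest = hashlib.sha256(body).hexdigest()

alerts = []
if status != 200:
    alerts.append(f"robots.txt returned status {status}, expected 200")

try:
    previous = open(STATE_FILE).read().strip()
    if previous and previous != digest:
        alerts.append("robots.txt content has changed since the last check")
except FileNotFoundError:
    pass  # first run, nothing to compare against

with open(STATE_FILE, "w") as f:
    f.write(digest)

for alert in alerts:
    print("ALERT:", alert)  # in practice: send to email/Slack/etc.
```

Run it on a schedule (cron, CI, serverless) and a stray `Disallow: /` deployed by mistake gets caught before it hurts traffic.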
Boring work should be done by algorithms, and strategic SEO decisions should be made by a person. We are super excited to save a ton of your time by introducing Preset Datasets for you.
1. All possible lists of problematic URLs from crawl data:
2. Everything you need to know about Googlebot’s behavior on your website – it is your main client at the end of the day:
3. SERP efficiency (GSC data):
This feature is designed specifically for websites driving affiliate traffic and for publishers. JetOctopus crawls external links to check the target page’s status code, title, and other key details. By identifying and fixing broken external links, you can ensure a seamless user experience and maximize affiliate revenue by avoiding the loss of users to non-functioning pages.
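Here is a simplified sketch of the underlying check – a hypothetical export of external links found during a crawl; JetOctopus does this at scale, while this illustration simply checks each target once:

```python
import pandas as pd
import requests

# Hypothetical export: external/affiliate links found on your pages.
links = pd.read_csv("external_links.csv")  # columns: source_page, target_url

results = []
for target in links["target_url"].unique():
    try:
        resp = requests.get(target, timeout=10, allow_redirects=True)
        status = resp.status_code
    except requests.RequestException:
        status = None  # DNS failure, timeout, connection error, etc.
    results.append({"target_url": target, "status": status})

checked = pd.DataFrame(results)
broken = checked[checked["status"].isna() | (checked["status"] >= 400)]

# Join back to see which of your pages point at broken targets.
report = links.merge(broken, on="target_url")
print(report)
```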
As JavaScript-heavy websites remain prevalent, we’ve dedicated significant resources in late 2024 to enhancing our JS crawler. With a next-generation crawler core now deployed, crawl speed has increased by over 40%, ensuring faster, more reliable analysis of JavaScript-based websites.