How we identified the reasons for the decrease and improved the situation
- A medical portal
- 700K pages
- 2.7 million monthly visits
The challenge was to find the exact reasons for the dramatic 40% SEO decrease:
WHAT WAS DONE
- Crawling the whole website
- Gathering log files for 30 days
- Log file analysis
- Superimposing the crawl data on the log data (see the sketch after this list)
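To make the superimposition step concrete, here is a minimal sketch of the idea: count Googlebot hits per URL from an access log, then compare that set against the URLs found by the crawl. It assumes combined-format logs; the file names and the plain-text crawl export are hypothetical, and the user-agent match is a simplification (strict Googlebot verification needs a reverse DNS check). This illustrates the approach, not how JetOctopus implements it.

```python
import re
from collections import Counter

# Hypothetical input files: a combined-format access log and a plain-text
# export of the URLs found by the crawl (names are illustrative).
LOG_FILE = "access.log"
CRAWL_FILE = "crawled_urls.txt"

# Combined log format:
# IP ident user [timestamp] "METHOD path HTTP/x" status size "referer" "user-agent"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

bot_hits = Counter()    # how often Googlebot requested each path
bot_errors = Counter()  # Googlebot requests answered with 4xx/5xx

with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = LOG_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        bot_hits[m.group("path")] += 1
        if m.group("status")[0] in "45":
            bot_errors[m.group("path")] += 1

with open(CRAWL_FILE, encoding="utf-8") as fh:
    crawled = {line.strip() for line in fh if line.strip()}

# Superimpose the two data sets.
visited = set(bot_hits)
orphans = visited - crawled   # the bot visits pages the crawl never found
ignored = crawled - visited   # pages in the site structure the bot skips

print("Pages eating the most crawl budget:", bot_hits.most_common(10))
print("Broken pages hit by the bot:", bot_errors.most_common(10))
print(f"{len(orphans)} orphan URLs, {len(ignored)} URLs never visited by the bot")
```

Comparing the two sides is what makes the findings below possible: heavy bot traffic to broken or junk URLs shows up immediately, as does the gap between what the site links to and what Googlebot actually fetches.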
RESULTS FOR LIKAR.INFO
- We detected a large amount of duplicate content.
- We also found a serious problem with Googlebot crawling: 30-35% of the crawl budget was wasted on just two URLs because they were broken pages.
- The bot was crawling a lot of pages with GET parameters, and JetOctopus helped us find them (a sketch of this check follows the list).
- Through log analysis we identified and cleaned up a lot of junk pages that had contributed to the SEO decrease.
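The GET-parameter problem can be surfaced from the same log data by grouping requested URLs by their path with the query string stripped: many parameter variants of one path usually mean duplicate content and wasted crawl budget. A minimal sketch with illustrative example URLs follows; JetOctopus reports this in its UI, so the code only shows the idea.

```python
from collections import defaultdict
from urllib.parse import urlsplit

def group_by_path(urls):
    """Group requested URLs by path with the query string stripped."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        if parts.query:  # keep only URLs that carry GET parameters
            groups[parts.path].append(url)
    return groups

# Illustrative URLs; in practice, feed in the paths Googlebot requested.
urls = [
    "/article?id=1&utm_source=mail",
    "/article?id=1",
    "/article?sort=date",
    "/contacts",
]
for path, variants in group_by_path(urls).items():
    if len(variants) > 1:
        print(f"{path}: {len(variants)} parameter variants crawled")
```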
DMITRY ZHURAKOVSKY (SEO SPECIALIST AT LIKAR.INFO)
It’s too early to talk about concrete results, but I’m sure the changes will pay off soon.
Desktop tools lag behind JetOctopus in speed, usability, and the range of services offered.
At present, JetOctopus is the only crawler that can scan all our pages really fast. The reports are easy to use, and it’s simple to work with the crawl results.
Reports are generated really fast, and there is a variety of useful options. I will definitely recommend JetOctopus to my SEO friends.
I give JetOctopus 9 out of 10 because I’m sure you can make your tool even better, but your crawler is already cool.
Get more useful info: Case study. Preply.com. 7% increase in indexability.
ABOUT THE AUTHOR
Serge Bezborodov is the CTO of JetOctopus. He is a professional programmer with 9+ years of experience. He worked with aggregators for 5 years: job vacancies, real estate, and cars. He also has experience in designing database architecture and optimizing queries. Serge has crawled more than 160 million pages and 40 TB of data with JetOctopus.