A few years ago, I noticed a significant change in the SEO visibility of Giphy.com, a widely used meme website, during one of Google’s core updates. Giphy experienced a drastic 90% drop in SEO visibility, a clear sign of just how closely major traffic losses can be tied to these updates.

In this article, I’ll share some intriguing cases of traffic loss linked to indexing issues, focusing on the recent October Spam Update.

Case Study 1: Complete Recovery Post-Update

The first case is particularly noteworthy. During the October Spam Update, Google deindexed 22% of this website’s pages on October 10th, leading to a 60% drop in SEO traffic. 

However, as you can see in the chart above, a rollback occurred by the end of October. The site’s indexed pages and traffic returned to their original levels.

Interestingly, the site then began to receive even more traffic than before. This raises the question: Was the October Spam Update too harsh on this website?

Case Study 2: Significant Traffic Drop = Significant Indexing Drop

Another website experienced a 35% decline in traffic around the same time in October. Coinciding with this, Google deindexed 50% of its pages on October 14th. 

Was this just a coincidence? I don’t think so 😀

Case Study 3: The Most Intriguing Scenario

The third case is perhaps the most fascinating. On October 7th, we observed a simultaneous drop in both traffic and indexing. 

What sets this case apart? Shortly after the main traffic decline, Google deindexed numerous pages due to duplicate content issues. 

However, this wasn’t your typical duplicate content problem. 

Google mistakenly identified many product and category pages as duplicates of the homepage, which was clearly inaccurate and sounded like a Google bug. 

This wasn’t an isolated incident. Last year, I noticed four similar cases and wrote about them in an article, “Google’s duplicate detection algorithm is broken.”

Shortly afterward, Gary Illyes from Google suggested on LinkedIn that such duplicate content issues might stem from Google’s challenges in rendering JavaScript content.

Now, a year later, we have gained more insights. In this particular case, the website was JavaScript-driven, leading me to believe that Google’s difficulty rendering JavaScript was causing the problem. With JavaScript disabled, these pages appeared very similar, which likely triggered the duplicate content flag.

If you’re facing similar issues with duplicate content, I recommend checking a sample of affected pages to see if they are indeed similar. If not, investigate potential JavaScript SEO issues.
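To make that check concrete, here is a minimal sketch of how you might compare the visible text of two pages as Google would see them with JavaScript disabled. The HTML snippets, page names, and similarity threshold are hypothetical illustrations, not part of the original case study; in practice you would fetch the raw (unrendered) HTML of your affected pages and compare those.

```python
import re
from difflib import SequenceMatcher

def visible_text(html: str) -> str:
    """Crude extraction of visible text: drop script/style blocks, then all tags."""
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()

def similarity(html_a: str, html_b: str) -> float:
    """Ratio in [0, 1]; values near 1 suggest the pages may look like duplicates."""
    return SequenceMatcher(None, visible_text(html_a), visible_text(html_b)).ratio()

# Hypothetical raw HTML of a JavaScript-driven site: with JS disabled,
# the homepage and a product page both render only the shared app shell.
homepage = "<html><body><h1>Acme Store</h1><script>renderApp()</script></body></html>"
product = "<html><body><h1>Acme Store</h1><script>renderApp()</script></body></html>"

# A page with real server-rendered content stays clearly distinct.
distinct = "<html><body><h1>Blue Widget</h1><p>Price: $19</p></body></html>"

print(similarity(homepage, product))   # identical without JS rendered
print(similarity(homepage, distinct))  # much lower: unique visible content
```

If the raw HTML of supposedly distinct pages scores near 1, that points toward a JavaScript SEO problem rather than a genuine duplicate content issue.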

Key Takeaways and Recommendations

In addition to these cases, I’ve found several other instances where Google deindexed numerous pages during recent core updates, resulting in significant traffic drops. 

Whenever I notice a drop in traffic, I review every report in Google Search Console to look for clues about what happened.

I particularly focus on indexing reports to determine if the traffic drop could be related to indexing issues. This includes overall indexing trends and specific issues like ‘Crawled – currently not indexed,’ which might indicate deindexing by Google.
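One simple way to quantify an indexing trend like the ones above is to diff two snapshots of your indexed URLs, for example lists exported from the Page indexing report on two dates. The URLs and dates below are invented for illustration:

```python
def deindexed(before: set[str], after: set[str]) -> tuple[set[str], float]:
    """Return the URLs that dropped out of the index and the share of pages lost."""
    lost = before - after
    share = len(lost) / len(before) if before else 0.0
    return lost, share

# Hypothetical snapshots of indexed URLs, e.g. exported on Oct 10 and Oct 20.
indexed_oct_10 = {
    "https://example.com/",
    "https://example.com/category/widgets",
    "https://example.com/product/blue-widget",
    "https://example.com/product/red-widget",
}
indexed_oct_20 = {
    "https://example.com/",
    "https://example.com/category/widgets",
}

lost, share = deindexed(indexed_oct_10, indexed_oct_20)
print(f"{share:.0%} of pages deindexed: {sorted(lost)}")  # 50% in this example
```

A drop like the 50% one in this toy data, appearing in the same window as a traffic decline, is exactly the pattern seen in the case studies above.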