Twitter.com recently saw a significant 30% decline in its SEO traffic, as estimated by Sistrix. Twitter no longer ranks in Google for many prominent keywords, including “Trump”, “Yankees”, and “Jennifer Lopez”.

The likely cause of this issue is rather straightforward.

The Cause: Restricting Googlebot’s Activity

Twitter’s CEO, Elon Musk, announced his intention to limit bot activity on the platform.

However, it appears that this restriction has unintentionally caught Googlebot as well. Unless Twitter addresses this technical problem, the consequences could persist.

Supporting Evidence:

Although we cannot be completely certain without access to Twitter’s internal data, several indicators strongly suggest that Twitter is blocking Googlebot, and that this is behind the traffic decline:

a. Decrease in Indexed Pages:

Barry Schwartz observed a substantial reduction in the number of indexed pages associated with Twitter.

Estimated indexed pages before Twitter’s policy change: 471M
Indexed pages after the policy change: 188M
Drop: ~60%

b. Visibility Drop:

Steve Paine from Sistrix noticed a significant 30% drop in visibility for Twitter.com (in the US market).

c. Personal Investigation:

In my own investigation, I used a rendering proxy to assess how Twitter pages are presented to Google. The results indicated that Google’s crawlers are being blocked.
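If you want to reproduce a rough version of this check without a rendering proxy, the sketch below simply fetches the same tweet URL with a Googlebot user-agent and a regular browser user-agent and compares the responses. The URL is a placeholder, and this is only an approximation: the fetch does not execute JavaScript, and Twitter may verify the real Googlebot by IP, so a spoofed user-agent can behave differently.

```python
import requests

# Placeholder URL - substitute any public Twitter/X post.
URL = "https://twitter.com/example_user/status/1234567890"

USER_AGENTS = {
    "Googlebot": (
        "Mozilla/5.0 (compatible; Googlebot/2.1; "
        "+http://www.google.com/bot.html)"
    ),
    "Browser": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
    ),
}

# Fetch the same URL with each user-agent and compare the responses.
# A blocked crawler typically gets a 4xx status, a redirect to a login
# wall, or a much smaller response body than a regular browser.
for label, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{label}: status={resp.status_code}, "
          f"final_url={resp.url}, body_bytes={len(resp.content)}")
```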

Findings of my personal investigation:

During my investigation, I took a screenshot of how Google’s URL Inspection tool renders certain Twitter posts.

It shows that Googlebot is blocked.

Although this evidence is not entirely conclusive due to recent changes in the user-agent of the URL Inspection tool, it provides valuable insights into the situation.

All of this evidence points to Twitter blocking Googlebot, which is hurting its SEO traffic.

P.S. Blocking Googlebot is not Twitter’s only SEO problem. During my investigation, I also noticed JavaScript SEO issues sabotaging Twitter’s traffic. I will explain this in an upcoming article, so stay tuned!

Avoiding Blame

Rather than assigning blame to specific individuals, it is important to recognize that decisions within Twitter, including layoffs and other major changes, are primarily made by CEO Elon Musk.

Consequently, it is conceivable that this decision was implemented without consulting Twitter’s SEO team.

Proposed Solutions:

While it is understandable that Twitter wishes to limit bot-driven interactions, since bots consume server resources, disrupt analytics, and scrape content, it is crucial to exempt Googlebot and Bingbot from these restrictions. This can be accomplished through development adjustments.

  1. Exclusions for Googlebot

Twitter’s developers can identify Googlebot either by its user-agent or by using the list of IP ranges that Google publishes; a sketch of such a check follows below. By whitelisting Googlebot, Twitter can ensure that its pages remain accessible to search engine crawlers without compromising the desired restrictions.

However, in some cases this can be considered cloaking; to mitigate the risk, Twitter can add structured data, as explained below.
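To illustrate what such a whitelist could look like, here is a minimal Python sketch. It assumes the machine-readable list of Googlebot IP ranges that Google publishes (googlebot.json, referenced in Google’s “Verifying Googlebot” documentation); the sample IP and the surrounding rate-limiting decision are hypothetical.

```python
import ipaddress
import json
from urllib.request import urlopen

# Machine-readable list of Googlebot crawl ranges published by Google
# (referenced in Google's "Verifying Googlebot" documentation).
GOOGLEBOT_RANGES_URL = (
    "https://developers.google.com/static/search/apis/ipranges/googlebot.json"
)


def load_googlebot_networks():
    """Fetch and parse Google's published Googlebot IP ranges."""
    with urlopen(GOOGLEBOT_RANGES_URL, timeout=10) as resp:
        data = json.load(resp)
    return [
        ipaddress.ip_network(p.get("ipv4Prefix") or p.get("ipv6Prefix"))
        for p in data["prefixes"]
    ]


def is_verified_googlebot(client_ip: str, networks) -> bool:
    """Return True if the request IP falls inside a published Googlebot range."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in network for network in networks)


# Hypothetical usage inside a rate limiter: exempt verified Googlebot.
networks = load_googlebot_networks()
if is_verified_googlebot("66.249.66.1", networks):  # sample IP for illustration
    print("Verified Googlebot - exempt from bot restrictions")
else:
    print("Not Googlebot - apply normal bot restrictions")
```

Checking IP ranges rather than trusting the user-agent string matters here: the user-agent is trivially spoofed by the very scrapers Twitter wants to block.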

  2. Twitter can add structured data for subscription and paywalled content.

More information can be found in Google’s documentation.
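For illustration, the snippet below generates the kind of JSON-LD markup that Google’s paywalled-content documentation describes: a CreativeWork marked as not freely accessible, with a hasPart selector pointing at the gated element. The @type and the .gated-tweet CSS selector are hypothetical choices for this sketch.

```python
import json

# Hypothetical JSON-LD for a gated tweet, following the structure from
# Google's paywalled-content documentation. ".gated-tweet" is an invented
# CSS selector; "SocialMediaPosting" is one plausible CreativeWork subtype.
structured_data = {
    "@context": "https://schema.org",
    "@type": "SocialMediaPosting",
    "headline": "Example post visible only to logged-in users",
    "isAccessibleForFree": False,
    "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": False,
        # Selector for the element that holds the gated content.
        "cssSelector": ".gated-tweet",
    },
}

# Emit the <script> tag that would go in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(structured_data, indent=2))
print("</script>")
```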

Conclusion:

Firstly, by implementing the proposed solutions, Twitter can strike a balance between restricting bot activity and maintaining visibility in search engines. This change should be relatively straightforward to implement and would benefit both Twitter as a company and its users.

Secondly, this is an open message to website owners and CEOs: whenever you implement huge changes like restricting robots, always consult your SEO team!