Recently, Twitter.com saw a significant decline of roughly 30% in its SEO traffic, as estimated by Sistrix. Twitter no longer ranks in Google for many prominent keywords, including “Trump”, “Yankees”, and “Jennifer Lopez”.
The potential cause behind this issue is rather straightforward.
The Cause: Restricting Googlebot’s Activity
Twitter’s CEO, Elon Musk, announced the company’s intention to limit bot activity on the platform.
However, it appears that this restriction has unintentionally caught Googlebot as well. Unless Twitter addresses this technical problem, the consequences could persist.
Although we cannot be completely certain without access to Twitter’s internal data, several indicators strongly suggest that Twitter is blocking Google, and that this has had serious negative consequences, including the traffic decline.
a. Decrease in Indexed Pages:
Barry Schwartz observed a substantial reduction in the number of indexed pages associated with Twitter.
| Estimated number of indexed pages BEFORE Twitter’s policy change | Number of indexed pages AFTER Twitter’s policy change | Drop |
|---|---|---|
b. Visibility Drop:
c. Personal Investigation:
In my own investigation, I used a rendering proxy to assess how Twitter pages are served to Google. The results indicated that Google’s bots are being blocked.
Findings of my personal investigation:
During my investigation, I took a screenshot of how Google’s URL Inspection tool renders certain Twitter posts, and it shows that Googlebot is blocked.
Although this evidence is not entirely conclusive, given recent changes to the user-agent of the URL Inspection tool, it provides valuable insight into the situation.
Taken together, this evidence indicates that Twitter is blocking Googlebot, and that this is hurting its SEO traffic.
Rather than assigning blame to specific individuals, it is important to recognize that major decisions within Twitter, including layoffs and sweeping changes, are primarily made by CEO Elon Musk.
Consequently, it is conceivable that this decision was implemented without consulting Twitter’s SEO team.
While it is understandable that Twitter wishes to limit the number of bot-driven interactions, as they consume valuable traffic, disrupt analytics, and scrape content, it is crucial to exempt Googlebot and Bingbot from these restrictions. This can be accomplished through development adjustments.
- Exclusions for Googlebot
Twitter’s developers can identify Googlebot either by its user-agent or by using the list of IP ranges that Google publishes. By whitelisting Googlebot, Twitter can keep its pages accessible to search engine crawlers without compromising the desired restrictions.
However, serving different treatment to Googlebot can in some cases be considered cloaking; to mitigate that risk, Twitter can also add structured data, as explained below.
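As a rough illustration of how such an exemption could work, here is a minimal Python sketch. It combines a cheap user-agent check with the reverse-then-forward DNS verification that Google documents for confirming genuine Googlebot traffic (the function names are my own, not Twitter’s code):

```python
import socket

def claims_to_be_googlebot(user_agent: str) -> bool:
    """Cheap first pass: does the request merely *claim* to be Googlebot?
    The user-agent header is trivially spoofable, so this alone is not enough."""
    return "Googlebot" in user_agent

def verify_googlebot(ip: str) -> bool:
    """Confirm the claim the way Google documents it: the IP's reverse DNS
    (PTR) hostname must end in googlebot.com or google.com, and that
    hostname must resolve forward to the same IP address."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False  # no PTR record: not a verified Googlebot
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(host) == ip
    except OSError:
        return False
```

A request handler would then skip the anti-bot rate limits only when both checks pass, so spoofed user-agents still get throttled while real crawlers are let through.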
- Twitter can add structured data for subscription and paywalled content.
More information can be found in Google’s documentation.
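For reference, Google’s paywalled-content markup looks roughly like the following JSON-LD fragment, which flags the gated part of the page so that limited serving to users is not mistaken for cloaking (the `.paywall` CSS selector is a placeholder; Twitter would use its own):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SocialMediaPosting",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywall"
  }
}
</script>
```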
Firstly, by implementing the proposed solutions, Twitter can strike a balance between restricting bot activity and maintaining visibility in search engines. This change should be relatively straightforward to implement and would benefit both Twitter as a company and its users.
Secondly, this is an open message to website owners and CEOs: whenever you implement sweeping changes like this, such as restricting robots, always consult your SEO team!