
With Google AI Overviews, ChatGPT, and Perplexity dominating how people find information, traditional Google rankings only tell part of the story. Today, success means being mentioned, referenced, and cited by AI chatbots.
ZipTie, a specialized AI search monitoring tool, has developed metrics to track exactly that – your brand’s visibility across multiple AI-generated responses.
The AI Success Score is ZipTie’s solution for measuring your performance across major AI platforms: AI Overviews, ChatGPT, and Perplexity.
It’s a single, clear number that shows how well your business appears in AI search results – and more importantly, where to focus your efforts for maximum impact.
Instead of juggling different signals from different AI tools, ZipTie’s AI Success Score provides unified performance tracking across:
This consolidated view makes it easy to see how your business performs in AI answers.
ZipTie’s AI Success Score evaluates three key elements of AI search performance:
How often do AI tools mention your brand? This metric tracks your direct presence in AI responses, including references to your company, products, or services.
It’s not enough to just be mentioned – context matters. The score tracks whether mentions are positive, negative, or neutral. Positive mentions significantly increase your chances of winning customers.
Citations measure how often AI search engines cite your website as a source. This shows whether your content is seen as authoritative enough to influence AI-generated answers.
Together, these three metrics paint a complete picture of your brand’s AI search success.
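ZipTie doesn’t publish the exact formula behind the score, but the idea of combining the three metrics can be sketched as a weighted average. The weights and function below are hypothetical, purely for illustration:

```python
# Illustrative sketch only: ZipTie's actual AI Success Score formula is not
# public. The weights and structure here are hypothetical assumptions.

def ai_success_score(mention_rate, sentiment, citation_rate,
                     weights=(0.4, 0.3, 0.3)):
    """Combine three 0-1 metrics into a single 0-100 score.

    mention_rate:  share of tracked AI answers that mention the brand
    sentiment:     share of those mentions that are positive
    citation_rate: share of answers citing the brand's site as a source
    """
    w_m, w_s, w_c = weights
    score = w_m * mention_rate + w_s * sentiment + w_c * citation_rate
    return round(100 * score, 1)

# Example: mentioned in 60% of answers, 80% positive, cited 30% of the time
print(ai_success_score(0.6, 0.8, 0.3))  # -> 57.0
```

The point of a composite like this is comparability: one number per project, platform, category, or query, so you can spot weak areas at a glance.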

ZipTie lets you monitor your performance over time through the Trends tab, where you can track mentions, sentiment, and citations as they evolve.

ZipTie offers flexible data views at different levels:
Project Level: See your overall AI success in a single score – your big-picture performance.
Platform Level: Compare how you perform across Google AI Overviews, ChatGPT, and Perplexity. You might discover you’re strong in Google but weak in ChatGPT—or vice versa.

Category Level: Using ZipTie’s automatic query categorization, identify which topic categories you dominate and which need work.

Query Level: Drill down to individual queries to see exactly where you’re succeeding and where you need to improve.
Once you have your AI Success Score, look for patterns:
Ask diagnostic questions. For example:
These are just starting points. The real value comes from analyzing your data, identifying patterns, and asking: What can I do to improve this score?
ZipTie provides several tools to help you improve:
The journey starts with understanding your AI Success Score, finding patterns in your data, and taking targeted action to improve your visibility where it matters most.
Recently, we had a client with a very unusual problem.
On weekends, something strange kept happening: the website would COMPLETELY lose rankings and traffic. People couldn’t even find it when they searched for it by name.
Then, as if by magic, the rankings came back after each weekend.
But here’s the fun part – it wasn’t a one-time thing. The next weekend would come, and the same problem would appear all over again, like a puzzle that kept repeating itself.
We noticed 9 websites facing the same puzzling situation.
So in the case of these 9 websites, the pattern was very clear. On these dates:
they were losing traffic and rankings COMPLETELY.
It’s like a mystery story – 9 websites, the same dates, weekends, and a magical cycle.
So what do we know about the issue so far?
I have one more interesting point that connects all the websites affected by the “Weekend Ranking Mystery”: their domain extensions are similar. But before we go any further, let me show the evidence, in the form of screenshots from the 9 aforementioned websites.
Google confirmed this is a bug. As a Google spokesperson told Search Engine Land: “The issue has since been resolved, and the sites should no longer be seeing its effects.”
Below, you will find 9 examples of websites that experienced MASSIVE drops in traffic/rankings on at least 5 out of 6 dates:
Most of the screenshots come from fellow SEOs and website owners, which is why they aren’t “standardized,” but they still illustrate the issue well.

The chart above clearly shows that the average position has been dropping significantly during the weekends. I have learned from the website owner that on these days, the website stopped ranking for any keywords, contributing to the “Weekend Ranking Mystery.”
Here we observe the same fluctuations in average position during the same days.
This chart displays both clicks and impressions. The website was successfully gaining more traffic until November 25th, when it fell into the “Weekend Ranking Mystery.”
In the case of this website, we have a chart depicting ranking fluctuations. This website fell into a ranking trap a little bit later than other websites, on December 8th.
This is another website that experienced issues during all the aforementioned dates, from November 25th to the beginning of January.
Same pattern as above. This website had suffered from ranking fluctuations before, but after November 25th, they went to an extreme.
This website’s ranking was stable for the last couple of months (as seen in the orange line), but then November 25th came, and the website fell into a ranking fluctuation trap.
This chart illustrates the average number of clicks over time. The website wasn’t very popular (before the “Weekend ranking mystery,” it was receiving around 100-150 clicks per day).
Then, regularly, Google started reducing its online visibility during the same dates as in the case of other examples.
This is another interesting case – the website had a stable number of clicks. However, starting from November 25th, during the weekends, the website stopped ranking for brand terms, effectively putting it into the “Ranking fluctuation trap.”
Given that all 9 domains experienced drops in ranking, traffic, and impressions on the same dates, it’s highly unlikely to be a coincidence.
Adding an extra layer to the story, there is one more intriguing coincidence that I will discuss further in the article.
When I looked at the affected domains, all of them used non-standard domain extensions.
9 out of 9 examples were non-standard TLDs, such as:
How can we judge these websites?
Those websites aren’t groundbreaking, but they seem to provide solid value in their respective niches.
If Google can wipe out these websites, then it can happen to any business on the web.
I’ve linked these drops in traffic to recent reports about spam websites. Many, including Lily Ray, have observed a significant surge in spam content within Google Search results. This influx is partly due to Google being overwhelmed by AI-generated content. In the age of AI, it’s become much easier for individuals to create automated systems that flood Google with content, build link networks among themselves, and more.
Google faces the challenge of dealing with these low-quality pages.
Did Google penalize these 9 websites because they were of low quality?
The answer is NO. However, there seems to be a connection, at least in my perspective.
Interestingly, all the affected domains had non-standard TLDs, such as .care and .consulting.
A similar situation occurred in the past, as reported in 2011. Back then, Google targeted various Polish regional domains through automated systems, arguing that these domains were inexpensive and often exploited by spammers.
Google assured that high-quality domains wouldn’t be affected, but in reality, many reputable websites were impacted as well. This could be closely related: Google might have observed that many spam websites use non-standard TLDs, often because they are more affordable than traditional .com domains, which are also often already taken.
If Google has an automated system in place, it may inadvertently target other websites that are generally acceptable.
So, why only on weekends? Shutting these websites down entirely would be too drastic.
Google might have identified a loophole in the system – websites bombarding Google with automated spam. During weekdays, Google can easily track spammy websites ranking high and react accordingly.
On weekends, however, it appears Google prefers to take down domains perceived as “risky,” erring on the safe side by cutting more domains than necessary to ensure its search results are not flooded with spam.
Of course, this is just a theory, but it seems like a plausible explanation. That’s why I’ve coined it the “Google Weekend Ranking Mystery” – a mystery that has to be solved by us, SEOs, and Google representatives.
You can never be sure. However, there is a very strong pattern: 9 domains happened to experience a traffic drop at the same time. It’s not a “traditional” penalty, as a website comes back, then loses rankings again in a “vicious circle” during very predictable timeframes.
No, the pattern is clear. These websites stopped ranking even for their brand names; they lost their positions completely. It cannot be explained by seasonality, with one exception: the drops around Christmas Eve can be.
Send it my way. I will add it to a shared document where I collect similar cases. Write a short summary with the dates of the drops. Additionally, let me know if you see any anomalies in the links report in GSC (such as very few links reported).
First and foremost, ensure it’s not related to any of the following possibilities:
This is something I explain in the upcoming Google Search Console Mastery Course.
Once you ensure that there are no basic SEO mistakes:
Considering there are strong patterns among the affected domains (they TOTALLY lose rankings during specific dates, plus ALL affected domains use non-standard TLDs), I consider it a Google bug, or Google’s automated systems being too lax.
If you’re a Googler reading this article, feel free to reach out to me via Twitter or LinkedIn; I have collected all the domains in a single document for more convenience.
If you are experiencing a similar pattern, reach out to me, and I will add your website to my document and hopefully share it with Googlers.
I would love to hear your thoughts about the “Weekend Ranking Mystery.” I hope to solve it fully, and if it’s a Google bug, get the official confirmation from Google. For many websites solely reliant on traffic from Google to drive their business, this is the level of transparency we should expect from Google.
P.S. Kudos to all the people who helped me analyze the patterns, especially my coworkers Marcin and Paulina;
Alice, for noticing the problem and collecting the initial list of websites that dropped;
Pawel Gontarek, for telling me there are more examples like this shared on the Google group.
Tom Slaiter – for sharing his insights.
And obviously – to all the people who decided to publicly share their case.
P.P.S.: Soon, ZipTie will be integrated with Google Search Console, so you will be able to spot traffic drops much more easily!
Technical SEO is a broad niche that continues to evolve. To help you easily navigate through technical SEO challenges, I’m sharing a crucial part of my personal toolbox in this article. Many online lists highlight Ahrefs, SEMrush, and other popular tools.
Assuming you already know them, I’ll focus on less common choices.
Before I start analyzing a specific website, I check the technologies used on it. This gives me clues about potential problems and obstacles.
Some CMSs or technologies are more limited than others. Some use JavaScript extensively, posing a risk to successful indexing and ranking.
Two very handy tools for checking the technological stack of a given website are Wappalyzer and Builtwith. Both have free versions, and you can install their Chrome addon.
Below, you can find an example report provided by Wappalyzer:
Using the Wappalyzer Chrome addon, we can quickly see that this particular website uses the Angular JavaScript framework.
At the very beginning, I have to be clear: knowing how to program isn’t mandatory for a technical SEO, but it’s definitely handy.
The most popular programming language among SEOs is Python. Programming skills enable you to:
If you want to learn Python, I’ve created a dedicated chapter on Python SEO in my ChatGPT ebook. My goal was to teach you how to program using ChatGPT even if you don’t have prior programming experience. You can get my FREE ChatGPT ebook here.
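To give you a taste of what Python automation looks like, here is a minimal, self-contained sketch (not taken from the ebook) that extracts all URLs from a sitemap using only the standard library:

```python
# Minimal sketch: parse a sitemap.xml and list its URLs.
# In practice you would fetch the sitemap over HTTP; a hard-coded string
# is used here so the example is self-contained.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Return every <loc> URL from a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP))
# -> ['https://example.com/', 'https://example.com/blog/post-1']
```

From there, a list of URLs like this typically feeds further automated checks: status codes, canonical tags, indexability, and so on.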
Another great tool for technical SEO is Google Search Console. As we know, this tool provides insights into your website’s performance in the Google Search Engine.
But it also offers numerous features that are particularly helpful for technical SEO, including:
Here’s just one example: this website (Disqus.com) presents empty content to Googlebot.
This is something one can diagnose using Google Search Console.
In my Google Search Console course (coming soon!), I teach how to utilize GSC from a beginner level to a mid-level SEO, covering essential aspects such as crawl budget analysis and more, without skipping any important details.
Looker Studio (formerly Google Data Studio) is a versatile tool for data visualization, widely recognized and utilized in the SEO community.
Many SEO companies leverage Looker Studio to present dashboards to their clients. Others use it to visualize traffic or business data in a better way.
Below, I present a selection of useful Looker Studio templates:
I believe an SEO crawler is a must-have for every technical SEO.
Many crawlers, among their basic features, offer numerous functions helpful for technical SEO:
Server log analysis is crucial for technical SEO. Large websites often face issues with Google not indexing some pages. Analyzing server logs lets you track which pages Googlebot visits most frequently and check whether it’s exhausting its crawl budget.
Some websites use solutions like Grafana or AWS for logs, which typically offer near real-time logfile analysis and useful visualization. Once you learn one solution of this kind, others will be very similar and intuitive to you.
If you don’t use solutions such as Grafana or AWS logs, ask your developer to send you a logfile. Tools like ScreamingFrog Log Analyzer, Splunk, or custom solutions made with Python or Knime will help you analyze server logs more effectively.
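If you go the custom-Python route, the core of a Googlebot log analysis fits in a few lines. The sketch below assumes the common Apache/nginx combined log format and uses hard-coded sample lines so it stays self-contained; serious analysis should also verify Googlebot’s identity (e.g. via reverse DNS), which is omitted here:

```python
# Sketch: count Googlebot requests per path in combined-format access logs.
# Sample lines are inlined to keep the example self-contained; real usage
# would read the logfile instead. Verifying the requests really come from
# Google (reverse DNS lookup) is omitted for brevity.
import re
from collections import Counter

LOG = r'''66.249.66.1 - - [10/Jan/2024:06:25:24 +0000] "GET /products/shoes HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [10/Jan/2024:06:25:30 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
66.249.66.1 - - [10/Jan/2024:06:26:02 +0000] "GET /products/shoes HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'''

# Capture the request path and the user-agent from each combined-format line.
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_hits(log_text):
    """Return a Counter of paths requested by Googlebot."""
    hits = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group("ua"):
            hits[m.group("path")] += 1
    return hits

print(googlebot_hits(LOG).most_common())
# -> [('/products/shoes', 2)]
```

Sorting paths by Googlebot hit count quickly reveals which sections soak up crawl budget and which ones Googlebot barely visits.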
Technical SEOs also focus on website speed, which is important for three reasons:
The most common tool for this is PageSpeed Insights. I appreciate this tool because it provides useful insights into how to enhance your website’s performance. Below are some recommendations for UsainBolt.com (the website of the fastest man):
In terms of web performance, I also enjoy working with the Performance tab in Chrome; it’s a very powerful tool.
As an additional note, many large companies use more advanced tools, such as New Relic. I find working with them rewarding due to their precision, surpassing other front-end tools. With a well-organized system, I can even identify which database queries consume excessive resources.
Knime may not be the most graphically appealing tool, but, on the other hand, it’s an extremely powerful one and becomes quite easy to use after overcoming the initial learning curve.
It’s like a fishing rod, not a fish. The way it works is that you build a workflow by adding what are known as “bricks.” This was well explained in Paul Shapiro’s presentation.
Here are some examples:
JavaScript remains problematic for SEO, particularly for large websites, as it can lead to ranking and crawling issues.
Typically, I examine how JavaScript changes content. To do this, I use one of two tools: WWJD (What Would JavaScript Do) – a tool developed by Onely, and Quick JavaScript Switcher.
Below, you will find a sample result from What Would JavaScript Do:
As you can see, in the case of Angular.io, when JavaScript is turned off, the website doesn’t present any content.
WWJD can also show you the differences in meta tags:
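Whichever tool you use, the underlying idea is simple: fetch the page with and without JavaScript execution, then diff the two snapshots. A toy version of the comparison step, assuming you already have both HTML snapshots (the ones below are made up), might look like this:

```python
# Toy sketch of the comparison step only: diff the raw HTML against the
# JS-rendered HTML to spot content that appears only after rendering.
# Fetching and rendering (e.g. via a headless browser) are omitted;
# the two snapshots below are hard-coded, hypothetical examples.
import difflib

raw_html = '<title>Loading...</title>\n<div id="app"></div>'
rendered_html = '<title>Angular Docs</title>\n<div id="app"><h1>Introduction</h1></div>'

def render_diff(raw, rendered):
    """Return a unified diff between the no-JS and rendered HTML."""
    return "\n".join(difflib.unified_diff(
        raw.splitlines(), rendered.splitlines(),
        fromfile="no-js", tofile="rendered", lineterm=""))

print(render_diff(raw_html, rendered_html))
```

Lines prefixed with `+` exist only after JavaScript runs – exactly the content whose indexing you need to double-check.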
You will also need a tool for analyzing indexing. For many people, Google Search Console is enough.
To increase the usefulness of GSC for analyzing indexing, I recommend you create separate GSC properties. Let’s say you have an e-commerce store – you can add three GSC properties: one for blog posts, a second for product pages, and the third for product category pages. It will make SEO analyses much easier.
If GSC is not enough for you, use ZipTie.dev. Some of the features:
Soon, we will integrate with GSC to provide even more useful insights, so stay tuned!
You can try ZipTie for free for 14 days.
A VPN is another crucial tool for technical SEO, as it allows you to “change” your IP address.
This is useful in two scenarios:
The two most popular VPNs are NordVPN (which usually offers a 3-year “evergreen” promo) and ExpressVPN.
This tool, Google.com (I’m sure you know it 😉), is both obvious and surprisingly useful for technical SEO analyses.
For instance, it helps determine if low-quality pages are indexed. If I notice that my website has search pages, I might want to check if Google has indexed pages with “0 results found.” In case you wonder why it’s important, you will find the answer in the article titled “Hidden Risk of Indexing Low-Quality Content”.
To check if Google has indexed search pages with “0 results found,” I can simply type into Google: “0 results found site:example.com.”
On top of that, I use Google to check if a website suffers from the SafeSearch filter.
Hint: most websites don’t, but it’s always useful to double-check. For instance, some pages of Vimeo aren’t shown for users with the SafeSearch filter on because of NSFW spam in the comment section.
The release of ChatGPT was a revolution in the SEO world. ChatGPT can assist you with various technical SEO tasks:
Basically, it can enable you to perform SEO analyses that wouldn’t be possible without spending tens of thousands of dollars.
In my FREE ebook, I have presented over 70 ChatGPT prompts manually crafted by me.
There is no specific way of using ChatGPT – the sky is the limit. For instance, I found out that researchers were able to start ranking in AI search engines using four simple techniques. I was curious if we could do the same for Google’s SGE. The next day, I wrote a chatbot that judges my content, gives me tips and recommendations, and shows potential rewrites.
All that, using just basic skills—knowing how to talk with ChatGPT.
Of course, throughout your career, you will use many other technical SEO tools. For instance:
Of course, it’s not an exhaustive list of tools useful for Technical SEO.
But I believe it’s helpful to show you what tools are most important.
There are a lot of specialized tools that don’t enter the mainstream market.
You just need to know what you’re looking for. For instance, if you work a lot with JavaScript websites, you might want a tool that easily switches off service workers. Then you Google it. And voilà!
Another good option is to ask on social media whether people know a specific tool that can do X.
If you have some interesting tools to share, send them my way via Twitter or LinkedIn. If I like the tool, I will happily include it in my list.