# What actually happened in March 2026
In the first week of March, Google rolled out a core algorithm update. That part is routine. What was not routine was the specific pattern of who lost traffic.
AI tool directories, comparison sites, and "best of" roundup pages took the biggest hits. Not all of them, but a very specific slice: the ones built on templated content produced fast, without clear author attribution, with affiliate links in every paragraph, and with category pages that were obviously generated from a data feed.
I am not saying this out of schadenfreude. aitoolradar.io, the site you are reading this on, is itself an AI tool directory. We took hits too, although smaller than most. The difference was partly luck and partly deliberate choices we made early on, and it is worth explaining which is which.
## Why this update hit directories specifically
Google's own post-mortem on the update talked about "low-effort content produced at scale" and "pages that exist primarily for monetisation rather than user value". That language is vague on purpose, but the pattern of who got hit reveals what they actually meant.
The sites that lost the most traffic shared three or four of these traits:
- Content written by models with minimal human editing, evident from the generic phrasing
- No visible author, or fake author profiles with stock photo headshots
- Every category page templated from the same database with no editorial layer
- Affiliate links in every paragraph, often with the exact same call-to-action box repeated
- Thin "comparison" content that reordered the same facts across dozens of URLs
That last pattern is the most telling. If your site has 500 pages like "X vs Y", "X alternatives", "best X for Y", and they are all templated from the same data, you were in the target zone.
AI tool directories happen to be a near-perfect match for this pattern. The market was built on it. Pick a tool, scrape its marketing site, produce a review, produce five alternative comparisons, slap affiliate links on every mention, repeat across 200 tools. That is the default playbook.
## Why aitoolradar took less damage
I do not want to pretend this site is special. It is not. We run the same basic directory model. But a few choices at the start mattered more than they looked at the time.
First, every guide on the site has a visible author (Roland Hentschel, which is to say me), a photo, a bio, and a link to my actual credentials. Not because we were gaming E-E-A-T, but because I actually write them. When the site launched I was the only writer, and that has stayed true even as the catalog grew.
Second, each guide lists its "verifiedSources" in the frontmatter and the text includes a "Last verified" date. That is visible to the reader and to Google. It signals that a human went and checked the pricing page on a specific date, rather than pulling it from a scraped data feed.
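To make that concrete, here is a minimal sketch of what such frontmatter could look like. This is an illustration, not the site's actual schema: the field names other than "verifiedSources", the tool name, and the URLs are all placeholders I am assuming for the example.

```yaml
---
title: "Example Tool Review"        # hypothetical guide title
author: roland-hentschel            # assumed author identifier
lastVerified: 2026-02-18            # surfaced to readers as "Last verified"
verifiedSources:                    # pages a human checked on that date
  - https://example.com/pricing
  - https://example.com/changelog
---
```

The point is not the exact schema but that the verification claim is structured and dated, so it can be rendered to readers rather than buried in prose.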
Third, we stated our affiliate relationships openly on every page: affiliate disclosure, Impressum (legal notice), Datenschutz (privacy policy). Boring corporate hygiene, but it is exactly what Google's quality raters look for when they check whether a site is operating transparently.
Fourth, and this is the big one, we limited affiliate links to a couple per page. Not one per paragraph, not a banner in every section. The economic incentive is to load a page with affiliate CTAs, because it maximises per-visitor revenue. We chose not to, because it makes the content worse, and Google now obviously agrees.
None of that makes us immune. We still took a hit on some category pages that were too thin (700-800 words for a whole category), which is one of the things Phase 2 of the site roadmap is addressing. But the hit was maybe 15-20 percent of directory traffic, not the 60-80 percent I saw on other sites in the category.
## The harder lesson
The March 2026 update is not a one-off event. It is the continuation of a three-year trend that everyone in SEO has been watching and most people have been ignoring.
The trend is: Google is systematically penalising content whose existence is primarily economic, not informational. That means:
- Affiliate pages that exist to capture commission, not to inform
- Comparison pages generated from templates, not from analysis
- Category pages stuffed with thin summaries, not original thinking
- AI-generated articles that regurgitate what other articles said
The economic incentive in content used to be "write the thing that ranks", which was often the same as "write the thing that a model would produce". That stopped being true around late 2025. Now the thing that ranks is the thing a model would not produce, at least not without a lot of human direction.
## What this means for content strategy
If you publish content of any kind, the practical implications are clearer now than they were a year ago.
Expertise is a moat, not a checkbox. Google can tell (imperfectly, but well enough) when a writer actually knows a domain versus when they are summarising what others wrote. Investing in expert contributors, or building expertise yourself, is the most durable SEO strategy in 2026.
Original data beats aggregated data. If your page includes numbers that only you could have produced (a test you ran, a survey you conducted, invoices you analysed), you have something nobody else does. That is true for SEO and true for human readers.
Opinion beats summary. Pages that take a clear position with reasoning, even a controversial one, rank better than pages that list options without evaluating them. Being hedgy and comprehensive is no longer neutral; it is penalised.
Consistency beats volume. One deeply reported post per week now outperforms five shallow ones. The old playbook of flooding the SERP with content is dead.
## What I would change if I started over
A few things I would do differently if I launched aitoolradar.io today instead of a year ago.
- Start with 20 tool guides, not 200. Focus on depth. Write guides that could only come from extended hands-on use, not from scraping marketing sites.
- Limit affiliate links to one per page, at most. Revenue per page drops, but retention and ranking improve. The long-term math is better.
- Publish fewer comparison pages, with more distinctive analysis. Ten "X vs Y" articles that take real positions beat 100 templated ones.
- Run more original research. Benchmark tests, pricing analyses, survey results. This is the content competitors cannot copy.
- Invest in the about page, author pages, and methodology page from day one. These are not marketing fluff; they are the signal layer that tells Google (and readers) whether you are operating in good faith.
## The optimistic read
Directory and comparison sites are not going away. They serve a real user need: cutting through the marketing noise to find what actually works. What is going away is the lazy version of that model.
The sites that survive this transition will be the ones that invested in the expensive parts: real expertise, verified sources, clear author identity, restrained monetisation, original research. The sites that do not survive will be the ones that optimised for cheap content at scale.
That is a harder business to run. It is also a much better one to read. As someone whose job depends on Google traffic and who also wants to read content that is worth reading, I am fine with that trade.
If you want more on how to build content that holds up, the pattern we follow at this site is written up in our CONTENT-GUIDELINES (public on GitHub) and referenced in most of our guides. For a broader overview of AI content tools specifically, see our writing tools category.
