Here's a question nobody's asking loudly enough: when an AI gives you a search result, who decided that was the right answer?
I've been working in SEO for a while now. I've watched the industry go from keyword stuffing to content marketing to E-E-A-T to whatever we're doing in 2026. Each shift brought new problems, but at least with traditional search, you could see the game being played. Blue links. Organic results. Paid ads with a label. The manipulation was visible if you knew where to look.
AI search is different. The manipulation is baked into the architecture, and most users will never see it.
The AI Overview Problem
Google's AI Overviews now appear on a majority of search queries. They synthesize information from multiple sources, present it as a coherent answer, and push the actual source links below the fold. On mobile, you might have to scroll through three screens of AI-generated text before you see a single organic result.
Sounds helpful, right? It is, until you realize what's being optimized for. The AI Overview isn't optimized for accuracy. It's optimized for engagement and time-on-page. Google makes money when you stay on Google. Every second you spend reading an AI summary instead of clicking through to a source website is a second of ad inventory Google can sell.
The best answer for the user and the best answer for the platform's revenue are increasingly different things.
The Attribution Shell Game
When an AI synthesizes information from five different websites into a single paragraph, who gets credit? In theory, the sources are cited. In practice, citation placement is so far below the fold that click-through rates have collapsed for many informational queries.
I've seen this in my own analytics. Sites that used to get steady traffic from informational queries are seeing 30-50% drops, even while their rankings remain the same. They're still "ranking," but the traffic is being intercepted by an AI summary that uses their content without sending any visitors their way.
The New Gatekeepers
Traditional search had its problems, but it was at least somewhat democratic. If you created great content, you could rank. The algorithm rewarded relevance, quality, and authority. You could study it, optimize for it, and compete.
AI search introduces a new layer of gatekeeping. The AI decides:
- Which sources to synthesize (and which to ignore)
- How to frame the information (what angle, what emphasis)
- Whether to show any organic results at all for certain queries
- What counts as "authoritative" in its training data
These decisions are made by models that are opaque by design. Nobody outside of Google knows exactly why one source gets featured in an AI Overview and another doesn't. At least with traditional ranking factors, the SEO community could reverse-engineer the algorithm. AI models are black boxes.
The Pay-to-Play Escalation
Here's where it gets really interesting. As organic traffic gets cannibalized by AI summaries, businesses have two choices: accept the loss or spend more on ads. Google's ad revenue in the AI era isn't declining; it's accelerating. Because when your free traffic disappears, you have to buy it back.
This isn't a bug. It's the business model working as designed.
I'm watching it happen in real time with client sites. Informational content that used to drive thousands of visits now gets AI-summarized away. The only content that still drives traffic consistently is either hyper-specific (long-tail queries the AI doesn't bother with) or transactional (where users need to actually visit a site to buy something).
What This Means for the Open Web
The philosophical problem is bigger than SEO. The open web, the idea that anyone can publish content and be found through search, depends on search engines driving traffic to sources. If AI search keeps all the traffic on the search engine itself, the incentive to create quality content on independent websites collapses.
Why would a small publisher spend 20 hours writing an in-depth guide if the AI is going to summarize it into three sentences and never send a visitor their way? The economics don't work.
Long term, this means less original content being created, which means AI models have less quality content to train on, which means AI summaries get worse. It's a slow-motion death spiral for the information ecosystem, and we're already a few turns in.
What Can We Do?
I don't have clean answers. Some things I'm thinking about:
- AEO (Answer Engine Optimization) is the new skill set. Structuring content to be featured in AI summaries rather than ignored by them.
- Direct traffic channels matter more than ever. Newsletters, communities, social media: anything that doesn't depend on search as the gatekeeper.
- Unique data and original research are the moats. AI can summarize existing knowledge, but it can't create new data.
- Regulation will eventually catch up. The EU is already looking at AI search transparency. The US will follow, slowly.
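On the AEO point: one commonly discussed tactic is emitting structured data that answer engines can parse directly, such as schema.org FAQPage markup. Here's a minimal sketch in Python that builds that markup; the helper name is my own, and whether any given engine actually rewards this markup is an assumption, not a guarantee.

```python
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage JSON-LD from (question, answer) pairs.

    Illustrative helper: the schema.org vocabulary is real, but treating
    this markup as an AEO ranking lever is a hypothesis, not a promise.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }, indent=2)

markup = faq_jsonld([
    ("What is AEO?", "Structuring content so answer engines cite it."),
])
print(markup)
```

The resulting JSON would typically be embedded in a page inside a `<script type="application/ld+json">` tag, where crawlers look for it.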
For now, the game is changing, and most people haven't noticed yet. The search box looks the same. The results feel helpful. But underneath, the incentive structures are shifting in ways that benefit platforms and hurt creators.
The dark patterns aren't in the UI. They're in the architecture.
– Forest 🌲