SEO · 10 min read

What AI Overviews actually did to organic CTR

The data on click-through impact, the methodology behind the numbers, and what to actually do when half your informational traffic disappears.

Format
Article
Updated
Mar 12, 2026

TL;DR

AI Overviews compress organic CTR on informational queries by roughly twenty to thirty-five percent for the first organic result. Commercial and navigational queries are largely untouched. The loss is real but recoverable: shift the informational track from ranking-for-clicks to being-cited-inside-the-answer, instrument citation tracking with DataForSEO and manual prompt panels, and rebuild attribution with UTM hygiene and self-reported source fields.

01

The numbers, with the caveats up front

Across the FPWS tracked client base of forty-three sites in 2026, the first organic result on a query showing an AI Overview takes a median CTR drop of twenty-eight percent compared to the same query before the AI Overview launched. The range across verticals is twenty to thirty-five percent. Results two through five drop more steeply, in the range of forty to sixty percent. Commercial queries without an AI Overview are within two percent of their pre-2024 baseline.

The headline number you see in trade press is usually 'AI Overviews destroyed organic CTR.' The reality is narrower and more useful. The destruction is real on informational queries. Commercial and navigational SERPs are basically unchanged.

Our methodology is straightforward. We pulled Search Console data from forty-three client domains we have tracked continuously since 2023. For each query that now triggers an AI Overview, we compared average position-weighted CTR in the six months before AI Overview rollout to the six months after. We controlled for ranking position changes by only including queries where average position stayed within plus or minus one ranking slot across both windows.

The Search Console CTR field is the same field it has always been. We are not introducing new measurement noise. We are comparing like-to-like, before and after, on the same queries on the same domains.
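The comparison described above can be sketched in a few lines of pandas. The column names and the input layout here are illustrative assumptions, not the actual FPWS pipeline; the point is the position-stability filter and the before/after CTR delta:

```python
# Sketch of the before/after CTR comparison, assuming one row per
# (query, window) with columns: query, window ('before'|'after'),
# avg_position, clicks, impressions. Illustrative, not the FPWS code.
import pandas as pd

def ctr_delta(df: pd.DataFrame) -> pd.DataFrame:
    wide = df.pivot_table(index="query", columns="window",
                          values=["avg_position", "clicks", "impressions"])
    # Control for ranking movement: keep only queries whose average
    # position moved by at most one slot between the two windows.
    stable = wide[(wide["avg_position"]["after"]
                   - wide["avg_position"]["before"]).abs() <= 1.0]
    ctr_before = stable["clicks"]["before"] / stable["impressions"]["before"]
    ctr_after = stable["clicks"]["after"] / stable["impressions"]["after"]
    return pd.DataFrame({
        "ctr_before": ctr_before,
        "ctr_after": ctr_after,
        "pct_change": (ctr_after - ctr_before) / ctr_before * 100,
    })
```

A query that moved more than one slot drops out of the sample entirely, which is what keeps the comparison like-to-like.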

02

Why the first result drops less than results two through five

When an AI Overview appears, the user's eye lands on the answer block. If the answer satisfies the query, no click happens at all. That is the dominant case for informational intent. If the answer does not fully satisfy, the user scans for the source, which usually means the first organic result or a citation link inside the AI Overview itself.

Result one still gets a respectable share of the residual clicks. Results two through five are below the fold for many users once the AI Overview pushes the SERP down. They lose disproportionately. The math is not 'AI Overviews lose ten clicks evenly across the page.' It is 'AI Overviews concentrate the surviving clicks at position one and at the AI citation links themselves.'

Practical implication: ranking second on an informational query in 2026 is significantly worse than ranking second was in 2022. The drop-off curve is steeper. Position one or AI citation are the two outcomes worth fighting for. Anything else is mostly impressions without clicks.

03

Where the lost clicks go

Three places. First, the AI Overview itself satisfies the query and the click never happens. The user got the answer, they leave, they do not click anything. This is the largest bucket and explains roughly half of the CTR drop.

Second, citation links inside the AI Overview send a fraction of the clicks to the cited sources. If you are cited, you get a slice of this traffic. If you are not, you get nothing from this bucket. Citation traffic in our data converts at a rate similar to direct organic for the same query, sometimes higher because the user has already been pre-qualified by the AI's framing.

Third, some users bypass the AI Overview entirely and scroll to the organic results out of habit or distrust of the AI answer. This is the bucket the first organic result captures. It is shrinking as user habits adapt to AI Overviews being the default answer surface.

04

What to do about it: the FPWS playbook

The recovery playbook for AI Overview CTR loss has three moves. One: shift informational content production from ranking-for-clicks to being-cited-inside-the-answer, which is AEO work. Two: instrument citation tracking via the DataForSEO ChatGPT scraper plus manual prompt panels across fifty target queries per client. Three: rebuild attribution with UTM discipline, direct-detection heuristics on referrer-less traffic, and a 'how did you hear about us' field on every conversion form.

We do not try to recover the lost informational clicks. They are not coming back as clicks. We recover the value of those queries by being inside the answer instead of being beneath it.

Tactically that means rewriting the top informational pages on the site to be passage-citable: a TL;DR block sized at sixty to eighty words leading every long-form piece, question-form H2s where they fit, self-contained forty to eighty word answer blocks placed within two hundred pixels of every H2, FAQPage schema rendered server side, and llms.txt at the site root naming the entity and pointing to the canonical resources for each query type.
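For the llms.txt piece of that list, the file follows the proposed llms.txt convention: an H1 naming the entity, a blockquote summary, and markdown link lists pointing at the canonical resource for each query type. A hypothetical sketch, with placeholder names and URLs:

```markdown
# Example Agency

> Example Agency is an SEO consultancy. This file points AI crawlers
> at the canonical resource for each of our core topic areas.

## Resources

- [AI Overviews and organic CTR](https://example.com/resources/ai-overview-ctr): click-through impact data and methodology
- [llms.txt guide](https://example.com/resources/llms-txt): how to structure this file
```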

We then instrument the measurement. The DataForSEO ChatGPT scraper runs monthly across the target query panel. We diff this month's citations against last month's. We score citation position inside the AI Overview, sentiment, and which domain the citation links to. We track the same panel by hand in Perplexity and Google AI Overviews because the API coverage is incomplete.
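The month-over-month diff is simple once the scraper output is normalized. A minimal sketch, assuming each month's results have been reduced to a mapping of query to the set of cited domains (the record shape is our assumption, not the DataForSEO response format):

```python
# Diff this month's citations against last month's, per query.
# Inputs map query -> set of cited domains.
def diff_citations(last_month: dict, this_month: dict) -> dict:
    report = {}
    for q in set(last_month) | set(this_month):
        prev = last_month.get(q, set())
        curr = this_month.get(q, set())
        report[q] = {
            "gained": sorted(curr - prev),   # newly cited domains
            "lost": sorted(prev - curr),     # domains that dropped out
            "kept": sorted(curr & prev),     # stable citations
        }
    return report
```

Scoring citation position and sentiment sits on top of this; the diff is what tells you where to look each month.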

05

Attribution: harder, not impossible

The honest part of the AI Overview era is that attribution got harder. Referrer headers from AI surfaces are inconsistent. Some send a proper referrer, some send nothing at all so the session registers as direct, some send the AI surface's own domain, and some send a query-string-stripped version of your URL.

We close the gap with three complementary moves. First, every conversion form has a 'how did you hear about us' field with prefilled options including 'ChatGPT,' 'Perplexity,' and 'Google AI Overview.' Self-reported source data is messy but it is the cleanest signal we have for AI-referred conversions.

Second, we run direct-detection heuristics on traffic that hits a deep landing page with no referrer and no UTM. If someone arrives at /resources/llms-txt as their first session with no referrer, the probability they came from an AI surface is high. We tag this traffic as probable-AI in our analytics.

Third, we keep UTM discipline on the surfaces we control: every internal link, every email, every social post is UTM tagged so the residual untagged traffic is the bucket we attribute to AI and direct.
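UTM discipline is easiest to hold when tagging is mechanical. A minimal helper along these lines (the parameter values are examples, not a recommended taxonomy):

```python
# Append UTM parameters to a URL, preserving any existing query string.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    params.update({"utm_source": source, "utm_medium": medium,
                   "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(params)))
```

Run every owned link through something like this and the untagged remainder becomes a meaningful bucket instead of a mystery.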

06

What not to do

Do not delete your informational content because the clicks are down. The content is still earning citations and brand exposure inside AI Overviews. Deleting it removes you from the answer.

Do not chase 'AI Overview optimization' as a separate discipline from AEO. It is the same work. The Overview is one of several AI surfaces and the structural moves that earn citations there earn citations elsewhere too.

Do not abandon classical SEO for the queries that still send clicks. Commercial intent did not change. The keyword research, on-page optimization, internal linking, link earning, and Core Web Vitals work that drove rankings in 2022 still drive rankings in 2026 on commercial queries.

Questions

How much CTR does an AI Overview cost?

On informational queries that trigger an AI Overview, the first organic result loses a median twenty-eight percent of CTR. Results two through five lose forty to sixty percent. Commercial and navigational queries without an AI Overview show no measurable change. The numbers come from the FPWS tracked client base of forty-three sites comparing six months before and after rollout.

Want this work done for you?

Let's talk.