Average position has been a cornerstone metric in SEO reporting for years. It provides a simple, at-a-glance sense of where a site typically ranks in Google’s search results.
That sense is growing increasingly misleading as Google layers generative AI features, such as AI Overviews and AI Mode, on top of traditional blue link results.
Search Console then counts all these placements under the same metric. This merging of disparate result types means average position can be dragged toward worse-looking numbers by low-impact features or artificially improved by high-visibility but low-traffic placements.
It is time to retire the average position as one of your primary organic key performance indicators (KPIs) and adopt a more nuanced set of metrics that focus on authentic engagement and conversions.
The Changing Landscape Of Search Features
Over the past decade, Google has transformed from a simple list of 10 blue links into a dynamic search results page packed with interactive elements.
Readers now see AI Overviews at the top of the page that generate concise summaries drawn from multiple sources. They also encounter AI Mode, which combines machine-generated insights with standard links.
Further down, they may find knowledge panels that present quick facts and structured data, People Also Ask widgets that prompt deeper exploration, video snippets that surface relevant clips, local packs showing nearby businesses, and image or news carousels that encourage visual browsing.
Each new format alters user behavior and fragments the attention once reserved for blue link results. The result is a declining share of clicks on traditional listings, which makes a simple ranking metric far less meaningful.
How AI Overviews And AI Mode Inflate Your Average Position
Google Search Console now assigns AI Overviews the same rank position value as the very top link in the organic listings.
If your page features in that AI Overview box at position one and simultaneously ranks at position four in the blue links, your average position will be reported as roughly 2.5.
That figure suggests a page one presence, even though most traffic still comes from the standard link at position four.
In older versions of Search Console, rare placements, such as a query slot at position 12 in a People Also Ask box, would push your average position toward a higher, worse-looking number.
Now, those obscure placements are balanced or outweighed by top-heavy AI features.
The overall metric becomes distorted. An average position of two may feel like a genuine page one victory, but it offers little insight into where user clicks land.
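A small sketch makes the distortion concrete. Assuming an impression-weighted average (and using invented numbers), a page appearing in an AI Overview at position one and as a blue link at position four reports a blended position of 2.5, even though most clicks come from the lower placement:

```python
# Hypothetical numbers: one query where the page appears both in an
# AI Overview (position 1) and as a blue link (position 4).
placements = [
    # (feature, position, impressions, clicks)
    ("ai_overview", 1, 1000, 20),   # highly visible, few clicks
    ("blue_link",   4, 1000, 80),   # where most clicks actually land
]

total_impressions = sum(imp for _, _, imp, _ in placements)
avg_position = sum(pos * imp for _, pos, imp, _ in placements) / total_impressions

total_clicks = sum(clk for *_, clk in placements)
click_share = {feat: clk / total_clicks for feat, _, _, clk in placements}

print(f"Blended average position: {avg_position}")  # 2.5
print(click_share)  # 80% of clicks come from the position-4 blue link
```

The dashboard number says "page one"; the click distribution says the blue link at position four is still doing the work.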
Why This Matters For Your SEO Strategy
An inflated average position can mislead stakeholders into believing content is performing better than it is. A marketing dashboard reporting an average rank of 2.3 creates unwarranted confidence in page one visibility.
Resources may shift away from high-value keywords that sit at positions five to 10 but deliver strong conversion rates.
Teams might pour effort into optimizing for AI Overviews or AI Mode triggers that look impressive in reports yet generate few real visits.
Over time, this misplaced focus undermines return on investment. Budgets skew toward vanity improvements rather than actions that drive tangible engagement, leads, and sales.
If click-through rate and traffic volumes stay flat or decline despite a rising average position, you risk missing warning signs until revenues slip.
Metrics To Focus On Instead
To gain an accurate picture of SEO performance, we must unbundle average position.
Classify your rankings by feature type. Separate blue link placements from AI Overviews, People Also Ask entries, video snippets, local packs, and other rich features.
Generate click-through rates for each segment so you can see where users engage. Measure absolute organic traffic for top queries and compare that with historical baselines.
Analyze time on page to understand content resonance. Most importantly, connect behavior data to conversions or goal completions. This end-to-end view shows whether search visibility translates into business value.
Another way to reduce distortion is to use percentile-based position metrics. The median position or P50 gives the midpoint ranking across all queries. It is not swayed by a few very high or very low positions.
The 90th percentile, or P90, shows the position that 90% of your rankings are at or better than.
Charting P50 and P90 over time highlights trend directions with less noise from outliers.
You can also calculate a trimmed mean by excluding the top and bottom 5% of positions. Any of these approaches will provide a steadier reading of where your pages stand in the SERP landscape.
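These percentile metrics can be computed directly from an exported position list. Here is a minimal sketch using the Python standard library and hypothetical position data; the nearest-rank percentile method and the 5% trim level are illustrative choices, not anything prescribed by Search Console:

```python
import statistics

# Hypothetical per-query positions pulled from a Search Console export.
positions = [1, 1, 2, 2, 3, 3, 4, 4, 5, 6, 6, 7, 8, 9, 10, 12, 15, 22, 38, 55]

# P50: the midpoint ranking, unmoved by a few extreme positions.
p50 = statistics.median(positions)

def percentile(values, pct):
    """Nearest-rank percentile: the value at or below which pct% of rankings sit."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[k]

# P90: 90% of rankings are at this position or better.
p90 = percentile(positions, 90)

def trimmed_mean(values, trim=0.05):
    """Mean after dropping the top and bottom `trim` fraction of positions."""
    ordered = sorted(values)
    cut = int(len(ordered) * trim)
    kept = ordered[cut:len(ordered) - cut] if cut else ordered
    return sum(kept) / len(kept)

tmean = trimmed_mean(positions)
print(p50, p90, round(tmean, 2))
```

With this sample data, the plain mean is pulled upward by the positions at 38 and 55, while the median and trimmed mean stay close to where most rankings actually sit.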
Putting It Into Practice
First, export your Google Search Console data for the period or keywords you wish to analyze.
Add a feature tag to each query to mark whether it appeared as a blue link, AI Overview, People Also Ask entry, or another rich element (sometimes called a special content result block, or SCRB).
Many SEO tools now include feature filters to automate this step.
Once tagging is complete, calculate the click-through rate for each feature type by dividing clicks by impressions for that feature. Compare click-through rates to identify which formats drive engagement and which only inflate visibility.
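Once each row carries a feature tag, the per-feature calculation is a simple grouped division. A minimal Python sketch over hypothetical tagged export rows (the queries, labels, and numbers here are invented for illustration):

```python
from collections import defaultdict

# Hypothetical rows from a feature-tagged Search Console export:
# (query, feature, clicks, impressions)
rows = [
    ("best crm software", "blue_link",   120, 1500),
    ("best crm software", "ai_overview",  15, 1400),
    ("crm pricing",       "blue_link",    60,  900),
    ("crm pricing",       "paa",           5,  400),
]

clicks = defaultdict(int)
impressions = defaultdict(int)
for _, feature, clk, imp in rows:
    clicks[feature] += clk
    impressions[feature] += imp

# Click-through rate per feature type: clicks divided by impressions.
ctr = {feature: clicks[feature] / impressions[feature] for feature in impressions}
print(ctr)
```

The same grouping works in a spreadsheet pivot table or a pandas `groupby`; the point is that each feature type gets its own click-through rate rather than being blended into one number.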
Total organic clicks and analyze sessions to identify the content that earns sustained visits.
Update your dashboards to reflect these new metrics. Replace an overall average position chart with a histogram showing the distribution of rankings by feature.
Include a bar chart of click-through rates for each result type. Display time series graphs of organic sessions and goal completions to link visibility improvements back to conversions. Keep these graphs simple and focused on actionable insights.
For optimization, focus on tactics that boost click-through rate and conversion paths.
For blue link results, refine title tags and meta descriptions to create a stronger call to action.
Use structured data markup so that when your page appears in People Also Ask or as a video snippet, the preview offers more context. Review the content that underpins AI Overviews.
Make sure your page answers core user questions in clear headings and concise paragraphs so the generative model can source accurate summaries.
Where gaps exist between AI Overview content and user needs, create or expand sections to fill them.
Continuously iterate by filtering your data for high-value keywords and checking whether the AI features you trigger align with intent and deliver clicks.
In larger organizations, you may need to educate stakeholders on the limitations of average position.
Share before and after views of dashboards to show how the metric shifted once AI features entered the mix.
Walk through specific examples, such as a page that jumped from position five to an AI Overview at position one, yet saw no change in traffic.
Demonstrations like these will build consensus around moving to feature-based and engagement metrics that drive tangible business outcomes.
Summary
Generative AI features in Google Search represent a fundamental shift in how search results appear.
Average position once served as a valuable proxy for visibility, and as one of the few first-party data sources available for it. It now obscures more than it reveals.
By breaking performance down by feature type, measuring click-through rates and conversions, and adopting percentile-based ranking metrics, you can cut through the noise.
This richer approach reveals what matters to your users and your bottom line. In the new era of search, a deeper, more actionable analysis will be your key to sustained SEO success.
More Resources:
- How To Write SEO Reports That Get Attention From Your CMO
- Telling Better Stories With SEO Data To Show Business Impact
- The State Of AI In Marketing