Inside AI Max, PMax & Smart Bidding: How PPC Managers Regain Control

Google Adds New Performance Max Controls And Reporting Features

Google introduces Performance Max updates, including audience exclusions, budget projections, and expanded reporting to give advertisers more visibility and control over campaign performance.

Brooke Osmundson

Google has announced a new set of updates to its Performance Max campaign type, focused on two areas advertisers have consistently asked for: more control over who campaigns prioritize, and better visibility into where budget is going.

The updates include first-party audience exclusions, budget reporting, expanded audience reporting, and placement reporting segmented by network.

Read on for more updates and what this means for your campaigns.

New First-Party Audience Exclusions

The first update Google announced was framed around more precise steering for your target audience.

Advertisers can now exclude specific first-party customer lists from Performance Max campaigns.

If your goal is acquiring net-new customers, excluding existing customer lists can help reduce wasted spend on people who may have converted anyway. It also creates a cleaner setup for evaluating whether Performance Max is actually contributing incremental value.

That said, this still depends heavily on how clean and current your first-party data is. If your customer match lists are outdated, incomplete, or poorly segmented, this feature won’t solve the problem by itself.

It also does not turn Performance Max into a precision audience campaign. Advertisers should still think of this as directional steering, not rigid targeting.

New Reporting Features Focused On Budget And Audience Visibility

The second part of Google’s update focuses on new reporting levers.

First is the budget report. Advertisers can now access it directly within a Performance Max campaign to help forecast end-of-month spend. It can also model how changing the daily budget would affect potential performance.

Google is also expanding audience reporting with more detailed demographic and segment-level performance views, including breakdowns such as age range and gender.

Image credit: Google, March 2026

That should give advertisers more context around who the system is actually reaching, rather than just what overall campaign performance looks like.

The last reporting update announced is around network reports. Advertisers can now segment placement reports by network to see:

  • Where ads have served
  • Whether placements meet brand safety expectations across all Google-owned channels

The placement report lives under the “When and where ads showed” tab.

Why This Matters For Advertisers

Google has continued to deliver on its promise of more transparency for advertisers in these automated campaign types. The company is making Performance Max more useful for marketers trying to manage it more intentionally.

The first-party audience exclusion update gives advertisers a more practical way to support acquisition-focused strategies. Brands trying to reduce overlap between prospecting and retention efforts may find this especially helpful.

The reporting updates will likely have broader day-to-day value.

Budget reporting should make it easier to monitor pacing and explain monthly spend behavior, especially for teams working within strict budget expectations or reporting back to stakeholders.

Expanded audience reporting gives advertisers more context around who campaigns are actually reaching. That matters when conversion volume alone doesn’t tell the full story.

Network segmentation in placement reporting also adds a layer of visibility many advertisers have wanted for a long time, particularly those keeping a close eye on brand safety and placement quality.

Taken together, these updates give advertisers more visibility into how Performance Max is spending and who it’s reaching.

Looking Ahead

This rollout is more useful than groundbreaking, but that does not make it insignificant.

Google continues to fill in some of the operational gaps that have made Performance Max harder to manage than many advertisers would like.

For teams already using it, these updates should make campaign oversight a little easier.

For teams that have been frustrated by limited visibility, this is another step toward making Performance Max more workable in real account management.

Google Is Replacing Dynamic Search Ads With AI Max

Google replaces Dynamic Search Ads with AI Max. Learn what’s changing, when migrations begin, and what advertisers should do before September upgrades.

Brooke Osmundson

Google just announced the deprecation of Dynamic Search Ads (DSA) and is officially moving its legacy capabilities into AI Max.

Starting in September, eligible campaigns using Dynamic Search Ads (DSA), automatically created assets (ACA), and campaign-level broad match settings will automatically upgrade to AI Max.

While advertisers have speculated about this change for months, the update is now official.

If you’re running Dynamic Search Ads, automatically created assets (ACA), and/or campaign-level broad match settings, keep reading to understand how your campaigns will be affected.

DSA Features Migrating Into AI Max

Beginning in September, advertisers will no longer be able to create new DSA campaigns through Google Ads, Google Ads Editor, or the Google Ads API. Existing eligible campaigns will be migrated automatically.

Google positions AI Max as the next generation of DSA.

Historically, DSA helped advertisers capture additional search demand beyond their keyword lists by using website content to generate headlines and choose landing pages. That made it useful for large sites, inventory-heavy businesses, and advertisers looking for broader query coverage.

AI Max keeps that concept but adds more signals and controls.

According to Google, AI Max combines advertiser assets, landing page content, and broader intent signals to help match ads to more relevant queries. It also adds controls such as:

  • Brand controls
  • Location controls
  • Text guidelines
  • Search term matching
  • Text customization
  • Final URL expansion

Image credit: Google, April 2026

Google says campaigns using the full AI Max feature suite see an average of 7% more conversions or conversion value at a similar CPA or ROAS compared with using search term matching alone.

Google is also splitting the transition into two phases.

Phase 1: Voluntary Upgrades

Google announced that upgrade tools for existing DSA users are rolling out this week.

DSA advertisers will receive tools to move historical settings and data into new standard ad groups. ACA and campaign-level broad match users may see in-platform prompts to upgrade to AI Max.

Phase 2: Automatic Upgrades

Starting in September, remaining eligible campaigns with legacy settings will be upgraded automatically.

Google says all eligible upgrades are expected to finish by the end of September.

It’s important to note how legacy settings will be automatically migrated over to AI Max settings:

  • DSA users will have all three AI Max features enabled by default (search term matching, text customization, final URL expansion)
  • ACA users will have two AI Max features enabled by default (search term matching and text customization)
  • Campaign-level broad match users will have just search term matching enabled by default

What Advertisers Can Do To Prepare For The AI Max Transition

If you still rely on Dynamic Search Ads, now is the time to review where those campaigns sit in your account and how much value they drive.

Some advertisers use DSA as a core growth lever. Others use it as a low-maintenance catch-all for incremental growth. Your next steps may differ depending on that role.

#1. Review Your DSA Performance Now

Before the automatic upgrades begin, pull recent performance data for your DSA campaigns.

Look at conversions, assisted conversions, search terms, landing pages, and efficiency metrics. That baseline will help you judge whether performance changes after migration are positive, neutral, or negative.

#2. Upgrade On Your Timeline Before Automatic Upgrades

Google is encouraging advertisers to move early, and there is a practical reason for that.

A voluntary upgrade gives you more control over settings, structure, and testing than waiting for an automatic migration.

If DSA is important to your business, it makes sense to evaluate the upgrade before September.

#3. Test AI Max Impact

Google recommends using one-click experiments because they give advertisers a cleaner way to compare performance before making a full rollout decision. While I haven’t tried this yet, I will be testing it myself in the coming months.

Even if AI Max improves results on average, averages do not guarantee results in every account. Lead generation, e-commerce, local services, and B2B advertisers may all see different outcomes.

Run controlled tests where possible and compare against your existing baseline.

#4. Lean Into Additional Controls

Many advertisers asked for more steering options in search automation, and Google has listened to our feedback. AI Max includes more controls than legacy DSA.

Spend time understanding brand settings, location controls, and text guidance. Those inputs may matter as much as the automation itself.

#5. Watch Search Match and Landing Page Quality

Once you’ve migrated your DSAs to AI Max, watch closely which search terms your campaigns are now matching. How do they compare to past DSA performance?

You’ll also want to pay attention to the landing pages used (if final URL expansion is turned on), lead quality, and conversion paths.

Looking Ahead

Dynamic Search Ads have helped advertisers scale beyond their current keyword lists for years. Now, Google is folding that capability into its broader AI Max framework.

The clearest next step is to review where DSA is still active in your account and decide whether to migrate on your own timeline or wait for the automatic upgrade.

The real focus should be protecting performance during the transition and understanding where AI Max improves results, or where it needs tighter management control.

How To Measure PPC Performance When AI Controls The Auction

Measure PPC performance in 2026 with AI-driven auctions, smarter attribution, profit-based metrics, and reporting frameworks built for Google Ads automation and AI search.

Brooke Osmundson

For most of the history of paid search, performance measurement followed a clear cause-and-effect relationship.

Advertisers controlled the inputs inside their campaigns like bid strategies, keyword and campaign structure, ad copy, and landing pages. All these factors contributed to conversion performance in some shape or form.

When performance changed, the explanation was usually traceable. For example, a new keyword theme improved conversion rates. Or, a bidding strategy increased efficiency.

That simple cause-and-effect framework is breaking down in real time, and has been for a while.

Over the past several months, Google has accelerated its transition toward AI-driven campaign types like Performance Max and Demand Gen, along with features inside them such as AI Max and AI-generated creative components.

Not only do these change how campaigns are set up and managed, but they also change how performance must be measured.

Advertisers increasingly receive conversions from queries they did not explicitly target, from creative assets that are automatically assembled, and from placements distributed across multiple channels. In this environment, measuring performance by analyzing individual campaign inputs becomes less useful.

The real challenge is understanding how automated systems generate outcomes.

This article provides a measurement framework for that reality. It explains what has changed in advertising platforms, how PPC teams can evaluate performance when automation controls more of the auction, and how practitioners can communicate results clearly to leadership.

The Current Measurement Crisis In PPC

Right now, most discussions about AI in PPC tend to focus on automation features like campaign types, targeting capabilities, ad creative development, and bid strategy expansion.

But there’s a deeper shift happening in measurement that isn’t talked about as much.

Automation introduces a larger set of variables influencing each auction. When the platforms make targeting, bidding, placement decisions (and more) dynamically, isolating the impact of individual campaign inputs becomes difficult.

Recent platform updates have not only changed how campaigns are managed, but also how performance should be interpreted. The connection between action and outcome is less direct, and in many cases, partially obscured.

Several platform developments illustrate why traditional measurement methods are becoming less reliable.

AI Max Expands Queries Beyond Keyword Lists

In my opinion, AI Max represents Google’s most aggressive step toward intent-driven matching.

Instead of relying solely on advertiser-defined keywords, AI systems evaluate contextual signals, user behavior patterns, and historical performance data to match ads with queries that may not exist in the account.

Not only that, but AI Max goes beyond search terms. It also has the ability to change your ad assets for more tailored messaging when Google deems appropriate.

For PPC managers, this introduces a structural shift in how to measure performance. Conversions may originate from queries that were never explicitly targeted.

And we knew that something like this was coming. Back in 2023, Google first publicly used the word “keywordless” in communications when talking about Search and Performance Max.

Source: Mike Ryan, X.com, March 2026

For example, a retailer who bids on “trail running shoes” may now appear for search terms like:

  • “best shoes for rocky terrain running”
  • “ultra marathon footwear”
  • “durable hiking running hybrids”

These queries reflect the same intent, but they don’t map cleanly back to the original keyword strategy.

Instead of trying to force these queries into keyword-level reporting, try analyzing performance by grouping them into intent clusters. By evaluating conversion rate and revenue at the category level, teams can maintain strategic clarity even as query matching expands.
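
As a rough offline version of this grouping, search terms can be bucketed into clusters by shared category markers. A minimal sketch, where the cluster names and marker keywords are purely illustrative:

```python
# Illustrative intent clusters keyed by marker words that may appear
# in a query. These names and markers are made up for this example.
CLUSTERS = {
    "trail_running": ["trail", "terrain", "rocky"],
    "ultra_endurance": ["ultra", "marathon"],
    "hybrid_hiking": ["hiking", "hybrid"],
}

def cluster(query: str) -> str:
    """Return the first cluster whose markers appear in the query."""
    q = query.lower()
    for name, markers in CLUSTERS.items():
        if any(m in q for m in markers):
            return name
    return "uncategorized"

# The example queries from earlier in this section
terms = [
    "best shoes for rocky terrain running",
    "ultra marathon footwear",
    "durable hiking running hybrids",
]
for t in terms:
    print(t, "->", cluster(t))
```

Aggregating conversions and revenue per cluster, rather than per query, gives the category-level view described above.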

Google Ads already does a decent job of this in the Insights tab. Its “Search terms insights” report groups queries by “Search category,” where you can see conversions and search volume.

Screenshot by author, March 2026

Performance Max Distributes Spend Across Multiple Channels

Performance Max can further complicate measurement by distributing budget across Search, YouTube, Display, Discover, Gmail, and Maps.

Up until last year, there was little-to-no transparency into how spend was allocated across those channels. In April 2025, Google added long-awaited channel reporting to the PMax campaign type. It now shows channel-level reporting, better search terms data, and expanded asset performance metrics.

For example, say you have a $40,000 monthly PMax campaign budget and see this channel breakdown:

Channel    Spend     Conversions
Search     $18,500   310
YouTube    $10,200   82
Display    $7,100    45
Discover   $4,200    28

If Search drives the majority of conversions, but YouTube consumes a large portion of spend, PPC marketers could try the following:

  • Test separating out branded search outside of PMax.
  • Refine asset groups to improve search alignment.
  • Run controlled experiments comparing PMax vs. Search.
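
One quick way to quantify that imbalance is to compute each channel’s cost per conversion and compare its share of spend to its share of conversions. A minimal sketch using the hypothetical numbers above:

```python
# Hypothetical PMax channel breakdown from the example above
channels = {
    "Search":   {"spend": 18500, "conversions": 310},
    "YouTube":  {"spend": 10200, "conversions": 82},
    "Display":  {"spend": 7100,  "conversions": 45},
    "Discover": {"spend": 4200,  "conversions": 28},
}

total_spend = sum(c["spend"] for c in channels.values())
total_conv = sum(c["conversions"] for c in channels.values())

for name, c in channels.items():
    cpa = c["spend"] / c["conversions"]          # cost per conversion
    spend_share = c["spend"] / total_spend       # share of budget consumed
    conv_share = c["conversions"] / total_conv   # share of conversions driven
    print(f"{name:9s} CPA ${cpa:7.2f}  spend {spend_share:5.1%}  conv {conv_share:5.1%}")
```

With these numbers, Search converts at roughly $60 per conversion and drives about two-thirds of conversions on under half the budget, while YouTube’s cost per conversion sits above $120.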

Measurement becomes an exercise in interpreting how the system allocates spend rather than controlling each placement.

Ads Are Beginning To Appear Inside AI Conversations

Conversational search introduces an entirely new layer of complexity into PPC measurement.

Google is now testing shopping results embedded directly within AI Mode, allowing users to compare products without leaving the interface.

Google isn’t the only one doing this. ChatGPT announced on Jan. 16, 2026, that it would begin testing ads for its Free and Go users in the United States.

No matter which platform is running or testing ads in AI conversations, it’s clear that the measurement gap hasn’t been solved, leaving many PPC managers with unanswered questions.

In my own recent search, I came across ads at the end of an AI Mode thread when I searched “noise cancelling headphones”:

So, if I were to click on one of those sponsored ads but convert at a later time, that attribution is unclear right now. Will my conversion be measured from the AI recommendation, the product listing click, or a later branded search?

These journeys challenge traditional attribution models, which were built around linear click paths rather than multi-step AI interactions.

Why Traditional PPC Metrics Are No Longer Enough

Many PPC reporting dashboards still rely on communicating metrics like impressions, clicks, conversion rate, and return on ad spend.

While some of those metrics remain useful, they no longer tell the full user story in automated, AI-driven environments.

These three shifts explain why.

1. Attribution Windows Are Expanding

AI-assisted search increases both the length and complexity of user journeys.

Research from Google and Boston Consulting Group shows that “4S behaviors” (streaming, scrolling, searching, and shopping) have completely reshaped how users discover and engage with brands.

When AI introduces product recommendations earlier in a user’s journey, the time between initial interaction and conversion often grows, because that user may still be at the beginning of their research phase. Introducing a product earlier does not mean users will be ready to purchase it any earlier.

So, what can marketers do about that gap now? Here are a few helpful tips to better understand how users are engaging with your business:

  • Review conversion lag reports in Google Ads.
  • Analyze time-to-conversion in GA4. Are there any differences or shifts in the last three, six, or nine months?
  • Extend attribution windows to 60-90 days where appropriate.
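
For the GA4 check, one simple approach is to compute the lag between first touch and conversion from exported timestamps and compare the median across periods. A sketch with made-up dates:

```python
from datetime import datetime
from statistics import median

# Illustrative (first_touch, conversion) timestamp pairs, e.g.,
# exported from GA4 or a CRM. Dates here are invented for the example.
pairs = [
    ("2026-01-03", "2026-01-20"),
    ("2026-01-10", "2026-02-14"),
    ("2026-01-15", "2026-01-18"),
    ("2026-02-01", "2026-03-10"),
]

# Days elapsed between first touch and conversion for each user
lags = [
    (datetime.fromisoformat(conv) - datetime.fromisoformat(first)).days
    for first, conv in pairs
]
print(f"median conversion lag: {median(lags)} days")
```

Running the same calculation over the last three, six, and nine months shows whether the lag is stretching, which would argue for the longer attribution windows mentioned above.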

This ensures automated systems receive more accurate feedback on what drives conversions, and when.

2. Organic Search Is Losing Click Share

Search results now include AI Overviews, scrollable shopping modules at the top, and expanded ad placements across all devices.

Where does that leave organic listings?

A study conducted by SparkToro and Datos found that nearly 60% of Google searches end without a click.

This reduces organic traffic even more and shifts more demand capture towards paid media.

From a measurement standpoint, PPC should be evaluated alongside organic performance when possible.

Tracking blended search revenue provides a more accurate view of total search performance, rather than isolating paid channels.

3. AI Systems Optimize For Outcomes Rather Than Inputs

Traditional PPC management focused on inputs like keywords, bids, and ad copy to influence performance directly.

AI systems work differently. Instead of optimizing individual levers, they evaluate large sets of signals in real time to determine which combinations are most likely to drive conversions.

This changes what measurement needs to do. Instead of asking which specific keyword or bid strategy adjustment improved performance, marketers need to evaluate whether the platform is producing the right business outcomes.

As platforms take over more of the execution, measurement has to focus less on the mechanics and more on whether automation is driving profitable, meaningful results.

The New Measurement Stack For AI-Driven PPC

If AI is now controlling more of the auction, then PPC teams need a different way to evaluate performance.

The old measurement stack was built around visibility into campaign inputs. You could look at keyword performance, search terms, ad copy, device segmentation, and bid adjustments to understand what was working. That model starts to fall apart when automation is making many of those decisions on your behalf.

The replacement is a new measurement stack built around four layers:

  • Profitability.
  • Incrementality.
  • Blended acquisition efficiency.
  • First-party conversion quality.

Together, these give marketers a more accurate picture of whether automation is actually helping the business grow.

Start With Profit, Not Just ROAS

ROAS still has value, but it should no longer be treated as the primary success metric in highly automated campaigns.

The problem is that AI-driven systems are often very good at capturing demand that already exists. That can make campaign efficiency look strong on paper, even if the business is not gaining much incremental value.

A campaign with a 700% ROAS may still be underperforming if it is primarily driving low-margin products, repeat purchasers, or orders that would have happened anyway.

That is why profitability should sit at the top of the measurement stack.

Instead of asking, “Did this campaign generate enough revenue?” marketers should be asking, “Did this campaign generate profitable revenue?”

For ecommerce brands, this could mean incorporating:

  • Contribution margin.
  • Product margin by category.
  • Average order profitability.
  • New customer revenue vs. returning customer revenue.

A simple starting point is to compare campaign revenue against both ad spend and cost of goods sold.
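
As a hedged illustration of that comparison, here is how a revenue-only ROAS can mask a much thinner profit picture. The revenue, spend, and COGS figures are hypothetical:

```python
# Hypothetical campaign economics for a profit-based view of ROAS
revenue = 70000.0     # campaign-attributed revenue
ad_spend = 10000.0    # campaign ad spend
cogs_rate = 0.55      # assumed blended cost of goods sold (55% of revenue)

roas = revenue / ad_spend                 # classic revenue ROAS: 7.0x (700%)
gross_profit = revenue * (1 - cogs_rate)  # revenue left after COGS
net_profit = gross_profit - ad_spend      # profit after ad spend, too
profit_roas = gross_profit / ad_spend     # profit-based ROAS

print(f"ROAS {roas:.1f}x | profit ROAS {profit_roas:.2f}x | net profit ${net_profit:,.0f}")
```

A 700% ROAS on these numbers is really a 3.15x profit ROAS; on lower-margin product mixes the same headline ROAS could approach break-even.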

For lead gen advertisers, the same principle applies, just with different inputs:

  • Qualified lead rate.
  • Sales acceptance rate.
  • Close rate by campaign.
  • Revenue per opportunity.

If AI is optimizing toward cheap conversions that never turn into revenue, the system is learning the wrong lesson.

Add Incrementality To Separate Demand Capture From Demand Creation

The second layer of the stack is incrementality. This is where many PPC measurement frameworks still fall short.

Automation can be highly effective at finding conversions, but that does not automatically mean it is generating new business. In many cases, AI systems are simply getting better at intercepting users who were already on their way to converting.

If your campaign is mostly capturing existing demand, performance may look strong inside the ad platform while actual business lift remains modest.

This is why incrementality testing has become much more important in the AI era.

For PPC teams, this means at least part of measurement should be designed to answer: “Would this conversion have happened without the ad?”

You don’t need enterprise-level media mix modeling to get started. A few practical approaches include:

  • Geo holdout tests. Pause or reduce spend in a small set of markets while maintaining normal activity elsewhere.
  • Use Google incrementality testing. Google lowered the minimum spend for incrementality testing in its platform to $5,000, making it more affordable for many advertisers.
  • Branded search suppression tests. In select markets or windows, test the impact of reducing branded spend where brand demand is already strong.
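
For the geo holdout option, the readout is a simple lift calculation: compare conversion rates between markets where ads kept running and markets where they were paused. The numbers below are illustrative:

```python
# Hypothetical geo holdout readout. "control" markets kept ads running;
# "holdout" markets had spend paused for the test window.
control = {"users": 120000, "conversions": 3600}
holdout = {"users": 118000, "conversions": 3068}

control_rate = control["conversions"] / control["users"]   # 3.0%
holdout_rate = holdout["conversions"] / holdout["users"]   # 2.6%

# Incremental lift: the share of control-market conversions that would
# not have happened without the ads.
lift = (control_rate - holdout_rate) / control_rate
print(f"control {control_rate:.2%} | holdout {holdout_rate:.2%} | incremental lift {lift:.1%}")
```

In this sketch, roughly 13% of conversions are incremental; the rest would likely have happened anyway, which is exactly the demand-capture-vs.-demand-creation distinction described above.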

Answering this question does not mean automation is bad. It means PPC teams need a better way to distinguish between platform efficiency and true business lift.

Use Blended CAC To Measure Search More Realistically

The third layer of the new measurement stack is blended acquisition efficiency.

As AI Overviews, AI Mode, and other search changes continue to reduce traditional organic click opportunities, PPC should not be measured in a vacuum.

That is especially true for brands where paid and organic search are increasingly working together to capture the same demand.

A campaign may appear less efficient in-platform while still playing a critical role in maintaining total search visibility and revenue.

That is where blended customer acquisition cost (CAC) becomes useful.

Blended CAC looks at total acquisition spend across relevant channels and divides it by the total number of new customers acquired.

The formula for this is simple:

Total acquisition spend ÷ total new customers = blended CAC
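
Applied to hypothetical numbers, the calculation is a one-liner. The channel names and spend figures below are made up for illustration:

```python
# Hypothetical monthly acquisition spend across channels
spend = {
    "google_ads": 42000,
    "paid_social": 18000,
    "seo_content": 10000,
}
new_customers = 560  # total new customers acquired in the same period

# Blended CAC: total acquisition spend / total new customers
blended_cac = sum(spend.values()) / new_customers
print(f"Blended CAC: ${blended_cac:.2f}")
```

Here, $70,000 in total spend against 560 new customers yields a blended CAC of $125, regardless of which channel got credit for the last click.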

This gives leadership a much more realistic picture of what it actually costs to grow the business.

It also helps PPC managers explain why paid search may need to carry more weight when organic search visibility declines due to AI-driven search features.

In other words, this metric helps move the conversation away from “Did Google Ads hit target ROAS?” and toward “What is it costing us to acquire a customer across modern search systems?”

Make First-Party Conversion Quality The Foundation

The final layer of the stack is first-party data quality. This is the part many advertisers still underestimate.

As platforms automate more of the targeting, bidding, and matching logic, the quality of the signals you send back becomes even more important. If the platform is deciding who to show ads to and which conversions to optimize toward, your job is to make sure it is learning from the right outcomes.

That means not all conversions should be treated equally.

If a lead form completion, low-value purchase, repeat customer order, and high-margin new customer sale are all fed back into the system the same way, automation will optimize toward volume, not value.

For PPC teams, that means the measurement stack should include a serious review of conversion quality inputs, including:

  • Offline conversion imports.
  • CRM-based revenue mapping.
  • New vs. returning customer segmentation.
  • Lead quality or opportunity-stage imports.
  • Customer lifetime value indicators where available.
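
One way to operationalize this is to assign each conversion a feedback value reflecting margin, customer newness, or lead stage before importing it back into the platform. The function below is a sketch only; the field names, weights, and dollar amounts are illustrative assumptions, not an actual Google Ads API schema:

```python
# Sketch of value-based conversion mapping for offline imports.
# All field names, weights, and values here are hypothetical.
def conversion_value(event: dict) -> float:
    """Assign a feedback value that reflects business value, not volume."""
    if event["type"] == "purchase":
        value = event["revenue"] * event.get("margin", 0.4)  # margin-adjusted
        if event.get("new_customer"):
            value *= 1.5  # weight new-customer acquisition higher
        return round(value, 2)
    if event["type"] == "lead":
        # Scale a nominal lead value by CRM qualification stage
        stage_weights = {"raw": 0.1, "qualified": 0.5, "opportunity": 1.0}
        return 200.0 * stage_weights.get(event.get("stage", "raw"), 0.1)
    return 0.0

print(conversion_value({"type": "purchase", "revenue": 150.0, "new_customer": True}))
print(conversion_value({"type": "lead", "stage": "qualified"}))
```

Feeding values like these back, instead of a flat count per conversion, gives automated bidding a reason to chase high-margin new customers and qualified leads rather than raw volume.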

This is where measurement and optimization start to overlap.

If the wrong conversions are being measured, the wrong outcomes will be optimized.

That is why first-party data is not just a reporting issue. It is the foundation of the entire AI-era measurement stack.

What To Show Your CMO Or Clients

One of the most difficult aspects of managing automated campaigns is explaining performance to leadership teams.

Executives often expect reporting frameworks built around the mechanics of traditional campaign management. In automated environments, those indicators tell only a small part of the story.

A more effective reporting structure focuses on three layers that connect advertising performance to business outcomes.

The first layer should always focus on the metrics that leadership teams care about most. Revenue growth, contribution margin, and customer acquisition cost provide a direct connection between marketing activity and company performance. These indicators allow executives to evaluate marketing investments in the same framework they use to evaluate other business decisions.

Instead of presenting keyword-level reports, PPC leaders should begin with a clear summary of how paid media contributed to revenue and profit during the reporting period. If revenue increased by 18% quarter over quarter while customer acquisition costs remained stable, that outcome provides a far more meaningful signal than any individual campaign metric.

The second layer of reporting should explain how paid media contributes to the broader acquisition ecosystem. As AI-driven search experiences reshape the visibility of organic results, paid media often carries a larger share of the responsibility for capturing demand.

Blended customer acquisition cost provides an effective way to communicate this relationship. By combining marketing spend across channels and dividing it by the total number of new customers acquired, organizations gain a clearer understanding of the overall efficiency of their acquisition strategy.

This approach also helps executives understand how paid search interacts with organic search, social advertising, and other marketing channels. Rather than evaluating PPC in isolation, leadership can see how the entire acquisition system performs.

The final layer of reporting should focus on experimentation and strategic insights. Automated systems constantly evolve, and the best way to evaluate them is through structured experimentation.

Reports should include summaries of campaign experiments, including:

  • The hypotheses tested.
  • The metrics evaluated.
  • The outcomes observed.

For example, if enabling AI-driven query expansion increased conversion volume while maintaining acceptable acquisition costs, that result provides valuable guidance for future campaign structure decisions.

Equally important is identifying metrics that are becoming less relevant.

Keyword-level performance reports, average ad position, and manual bid adjustments were once central components of PPC reporting. In automated campaign environments, those metrics often provide little strategic value. Continuing to emphasize them can distract leadership from the outcomes that truly matter.

Effective reporting in the AI era should emphasize growth, profitability, and strategic learning rather than operational mechanics.

Measurement Gaps That Still Exist

Despite improvements in automation and reporting transparency, several emerging advertising experiences remain difficult to measure.

One example is the growing presence of personalized offers within AI-driven shopping experiences. Google’s Direct Offers feature allows retailers to surface dynamic discounts during AI-generated shopping recommendations. While the feature may influence purchase decisions, advertisers currently have limited visibility into how frequently those offers appear or how strongly they influence conversion behavior.

Without that visibility, marketers cannot easily determine whether the discounts are generating incremental revenue or simply reducing margins on purchases that would have occurred anyway.

Another emerging measurement challenge involves conversational commerce. Google has begun exploring “agentic commerce” systems where AI assistants help users research and purchase products across multiple retailers.

In these environments, the user journey may involve several conversational prompts before a purchase occurs. The traditional concept of an ad impression or click may become less meaningful when AI systems guide the user through a multi-step research process.

As these experiences evolve, marketers will need new attribution models capable of evaluating influence across conversational journeys rather than isolated interactions.

These developments highlight the importance of ongoing experimentation and advocacy from advertisers. Measurement frameworks will need to evolve alongside the platforms themselves.

The Future Of PPC Measurement

Automation has changed the mechanics of paid advertising, but it has not eliminated the need for strategic oversight.

If anything, the role of human expertise has become more important.

AI systems are extremely effective at executing campaigns across large datasets and complex auctions. What they cannot do on their own is define the business outcomes that matter most or interpret performance within the broader context of organizational growth.

The most effective PPC teams are adapting to this reality. Instead of focusing exclusively on the mechanics of campaign management, they are investing more effort in defining profitability metrics, designing incrementality tests, and building reporting frameworks that connect advertising performance to business outcomes.

Measurement in the AI era will look different from the measurement frameworks that defined the early years of paid search. The focus will shift away from controlling individual campaign inputs and toward understanding how automated systems generate value for the business.

For PPC practitioners and marketing leaders alike, that shift represents the next stage in the evolution of paid media strategy.


Featured Image: Roman Samborskyi/Shutterstock

How to Feed Google's PPC Algorithm the Intent Signals That Actually Drive Revenue

Align your Google Ads with real buyer intent, not vanity conversions.

CallTrackingMetrics

How First-Party Intent Data Makes Smart Bidding Smarter 

  • Why do my Google Ads conversions look strong, but my revenue isn’t growing?
  • How do I know if Smart Bidding is optimizing toward the wrong audience?
  • Which first-party signals should I feed Google Ads?

In most cases, the problem isn’t your bidding strategy; it’s that Google is optimizing toward the wrong conversions.

The fix is already in your business: the calls, conversations, and closed deals you generate every day contain the intent signals Smart Bidding needs to find the right people. The question is how to extract them and feed them back in. 

First, you need to make sure that your Google Ads conversion actions are not simply set to default.

How To Audit Your Google Ads Conversion Actions To Ensure Intent Is Aligned With Real Goals

By default, Google Ads tracks a range of events as conversions. For many businesses, some of those conversion goals don’t align with revenue, causing major issues with conversion and conversion-value reporting.

The audit below will help you identify where your AI-driven Google Ads campaigns are counting the wrong conversions.

Step 1: Audit Your Conversion Actions

  1. List every active conversion in your Google Ads account.
  2. Sort each into one of two buckets: tied to revenue, or tied to engagement.
  3. Move engagement signals to secondary so they’re visible in reporting but not driving bids.

Step 2: Add Qualified Inbound Calls As A Primary Conversion

  1. Set a call qualification threshold. Call duration can serve as a baseline in Google Ads, or use a call tracking vendor to layer in signals that can be more impactful. Tags, scores, and insights extracted from conversations, such as specific spoken phrases, are where you will find true call quality.
  2. Connect your lead data to Google Ads via offline conversion imports or Enhanced Conversions for Leads.
  3. Give Smart Bidding four to six weeks to recalibrate before evaluating.

Step 3, Ongoing: Run A Quarterly Lead Intelligence Review

  1. Pull the top five objections from call recordings, by campaign source.
  2. Update your keyword lists and negatives accordingly.
  3. Brief your creative team on the language your best leads are using.

What You’ll Likely Find: Google Is Optimizing Toward the Wrong People

According to ALM Corp’s State of PPC 2026 research, 53% of PPC professionals say their work is harder today than two years ago, and 65% blame black-box platform automation.

This is because you can’t see what the algorithm is doing.

Automated bidding systems like Smart Bidding make decisions across billions of auctions without showing their work, which means that when they start optimizing toward the wrong people, nothing in your campaign reports flags it.

The gap shows up in your CRM, not your Google Ads dashboard.

How to Spot the Gap Between Your Conversion Count and Your Actual Revenue

 The automation works on one principle: define a conversion, and it finds more of them. 

If form fills are your primary conversion action, Smart Bidding is optimizing toward people who fill out forms. Whether those fills turn into revenue is a separate question, and not one the algorithm is asking. 

The audience that fills out forms and the audience that buys are not the same people. Researchers fill out forms. Buyers do too, but so do their procurement leads, their interns, and the occasional student on a class project. 

When your conversion signal doesn’t separate them, Smart Bidding can’t either. The longer a campaign runs on that fuzzy signal, the more confidently it optimizes toward the wrong audience. 

A small reporting gap quietly becomes a large revenue gap. 

Read more on seven of the most common Google Ads conversion tracking issues on SEJ. 

Pull 90 days of Google Ads conversions and line them up against your CRM. Look for: 

  1. Campaigns with high conversion counts but low deal-close rates.
  2. Keywords producing form fills but no closed revenue across the quarter.
  3. Channels where CPL looks healthy, but CAC is quietly climbing.
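The CRM-vs-ads comparison above can be sketched as a simple join. The campaign names, counts, and 10% close-rate threshold below are all hypothetical; substitute your own exports and tolerance:

```python
# Hypothetical 90-day exports: Google Ads conversions by campaign,
# and closed-won deals by campaign from the CRM.
ads_conversions = {"Brand Search": 120, "Generic Forms": 340, "Pricing Intent": 80}
crm_closed_deals = {"Brand Search": 30, "Generic Forms": 12, "Pricing Intent": 24}

def close_rate_gaps(ads, crm, min_close_rate=0.10):
    """Flag campaigns whose deal-close rate falls below the threshold:
    lots of reported conversions, little closed revenue."""
    flagged = []
    for campaign, conversions in ads.items():
        closed = crm.get(campaign, 0)
        rate = closed / conversions if conversions else 0.0
        if rate < min_close_rate:
            flagged.append((campaign, conversions, closed, round(rate, 3)))
    return flagged

for campaign, conversions, closed, rate in close_rate_gaps(ads_conversions, crm_closed_deals):
    print(f"{campaign}: {conversions} conversions -> {closed} closed ({rate:.1%})")
```

Campaigns this flags are the ones feeding Smart Bidding a fuzzy signal: healthy-looking conversion counts that the CRM shows are not turning into deals.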

 Whatever that gap is, that’s your signal problem.

 The algorithm can only chase what you tell it to, so you have to give it a cleaner picture of who your best customers are and what they’re telling you.

What Smart Bidding Can Do When You Give It the Right Input

 When Smart Bidding is fed the right signals, it stops treating every conversion as equal.

It learns that a four-minute call from someone asking about pricing is worth more than a form fill from a researcher.

It learns that leads from certain campaigns close at three times the rate of others.

Over time, it shifts budget toward the auctions most likely to produce revenue.

The result is a campaign that gets more efficient the longer it runs, because every closed deal teaches it something the next bid can act on.

Fix Your Conversion Signals Before You Touch Your Bid Strategy

1. Build a Conversion Hierarchy That Puts Revenue First

Primary actions should map directly to revenue.

Secondary actions capture engagement without driving bids. 

Primary conversion actions (drive Smart Bidding): 

  1. Closed deals.
  2. Booked appointments.
  3. Qualified inbound calls.

 Secondary conversion actions (visible in reporting, not driving bids): 

  1. Page visits.
  2. Newsletter signups.
  3. Lead magnet downloads.

That reorganization alone shifts what Smart Bidding is optimizing toward.

2. Use Call Duration and Conversation Intelligence to Surface Your Highest-Intent Leads

 Inbound calls are the most underused high-intent signal in most PPC accounts, and call tracking is what makes them usable. 

  • A 20-second call is a hang-up.
  • A four-minute call about pricing is a buying conversation.
  • A call routed to sales instead of support signals someone in-market.

Smart Bidding can absolutely learn the difference if you feed it the right input.

Set a duration threshold, typically 90 seconds to 3 minutes depending on your sales cycle, and make qualified calls a primary conversion. Or use another call qualifier more relevant to your business, such as counting only callers who book appointments. Even better, pair call duration with booked appointments or an AI scoring layer.

 You’ve just given Smart Bidding a signal that reflects actual buyer intent, not just whether someone answered.
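The qualification logic described above can be sketched as a small filter. The field names and the 90-second threshold are illustrative, not any specific vendor’s schema:

```python
from dataclasses import dataclass

# Hypothetical call records from a call tracking export.
@dataclass
class Call:
    duration_seconds: int
    routed_to: str          # "sales" or "support"
    booked_appointment: bool

# Assumed threshold: tune to your sales cycle (the typical range
# suggested above is 90 seconds to 3 minutes).
MIN_QUALIFIED_SECONDS = 90

def is_qualified(call: Call) -> bool:
    """A call counts as a primary conversion if it booked an appointment,
    or cleared the duration threshold while routed to sales."""
    if call.booked_appointment:
        return True
    return call.routed_to == "sales" and call.duration_seconds >= MIN_QUALIFIED_SECONDS

calls = [
    Call(20, "sales", False),     # hang-up: not qualified
    Call(240, "sales", False),    # four-minute sales call: qualified
    Call(300, "support", False),  # long support call: not qualified
    Call(45, "support", True),    # short, but booked: qualified
]
qualified = [c for c in calls if is_qualified(c)]
```

Only the qualified calls get reported to Google Ads as the primary conversion; the rest stay visible in your call log but never touch bidding.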

3. Feed Your Offline Closed Deals Back Into Google Ads With Offline Conversions

To close the loop end-to-end, feed qualifying outcomes back to Google Ads via offline conversion imports or Enhanced Conversions for Leads: 

  1. Pass a GCLID at the point of lead capture.
  2. Upload conversion data when a lead hits a qualifying milestone (booked call, signed contract, closed deal).
  3. Let the algorithm learn which contacts turn into revenue and which quietly fade.
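Step 2 above often takes the form of a spreadsheet upload. The sketch below builds one from a hypothetical CRM export; the column headers follow Google’s click-conversion upload template, but verify them against the template downloadable in your own account before uploading:

```python
import csv

# Hypothetical CRM export: leads that hit a qualifying milestone,
# each with the GCLID captured at the point of lead capture.
qualified_leads = [
    {"gclid": "Cj0KCQjw_example1", "milestone": "booked_call",
     "closed_at": "2024-05-02 14:30:00-05:00", "value": 150.0},
    {"gclid": "Cj0KCQjw_example2", "milestone": "closed_deal",
     "closed_at": "2024-05-03 09:10:00-05:00", "value": 2400.0},
]

# Map CRM milestones to the conversion action names defined in Google Ads.
CONVERSION_NAMES = {"booked_call": "Booked Appointment",
                    "closed_deal": "Closed Deal"}

with open("offline_conversions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Header row per Google's click-conversion template; check the
    # current template in your account, formats do change.
    writer.writerow(["Google Click ID", "Conversion Name",
                     "Conversion Time", "Conversion Value",
                     "Conversion Currency"])
    for lead in qualified_leads:
        writer.writerow([lead["gclid"],
                         CONVERSION_NAMES[lead["milestone"]],
                         lead["closed_at"], lead["value"], "USD"])
```

Passing a real conversion value per lead (rather than a flat 1) is what lets value-based Smart Bidding strategies distinguish a $150 appointment from a $2,400 closed deal.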

 4. Get More From the Tech Stack You Already Have

Your CRM, your call tracking platform, and your Google Ads account should all be passing data to each other, so when a lead closes in your CRM, Google Ads knows about it, and Smart Bidding can factor it into the next auction. 

Google Ads Help recommends evaluating Smart Bidding over periods with at least 30 conversions (50 for Target ROAS).

 Below that volume, you’re asking the algorithm to optimize against noise. 

Note: Expect a recalibration period when you introduce new signals. When you introduce new primary conversion actions, Smart Bidding enters a two-to-four-week learning phase. Performance will wobble. That’s the algorithm doing its job with better data, not a reason to panic-adjust the budget.

 To see how existing call tracking users define their call conversions for Google, check out this guide on Google call conversions by the numbers.

How to Read & Use the Intent Signals Your Campaigns Are Already Generating

The bidding side is only half the story.

 The other half is what you can learn from the conversations your campaigns are already producing. There’s a lot of value hiding just below the surface in call recordings and transcriptions.

1. Map Every Intent Signal Across Your Full Lead Volume 

Every inbound conversation contains signals your keyword planner will never surface: 

  1. Objections recurring at the bottom of the funnel.
  2. Vocabulary your highest-intent leads actually use.
  3. Alternatives they mention by name.
  4. Hesitations that show up right before deals go quiet.

 AI-driven analysis catches patterns at a scale that manual review can’t match.
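As a toy stand-in for that analysis, even a simple phrase counter over transcripts surfaces recurring objections and competitor mentions. The transcripts and watch-list below are invented for illustration:

```python
from collections import Counter
import re

# Hypothetical transcript snippets from inbound calls.
transcripts = [
    "we're comparing you against acme, is setup included in the price",
    "what does setup cost, we got burned on hidden fees before",
    "our contract with acme ends next month, how fast is onboarding",
]

# Assumed watch-list of objection and competitor phrases to track.
WATCH_PHRASES = ["setup", "hidden fees", "acme", "onboarding"]

def phrase_counts(texts, phrases):
    """Count how often each watched phrase appears across all transcripts."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        for phrase in phrases:
            counts[phrase] += len(re.findall(re.escape(phrase), lowered))
    return counts

counts = phrase_counts(transcripts, WATCH_PHRASES)
```

A real conversation-intelligence tool does this across thousands of calls with semantic matching rather than literal strings, but the output is the same shape: a ranked list of what your leads keep bringing up.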

2. Use Lead Data to Align Your Ad Copy and Landing Pages

When the same concern surfaces consistently from one campaign, that’s a content signal. 

  • Your landing page may be missing a key objection.
  • Your ad copy may be attracting the wrong intent.

 Update your copy based specifically on the language and objections your prospects are actually using.

3. Mine Your Best Leads’ Language to Sharpen Your Keyword Strategy

The language your best prospects use rarely matches the keywords you’re bidding on.

 Mining their vocabulary surfaces high-intent terms you’re not capturing and sharpens your negative list at the same time.

 For industry-specific examples, CTM’s guide to AI prompts for call analysis is a good starting point.

4. Build a Lookalike Audience From Your Highest-Value Conversations

 Pattern-match your highest-value customers at the point they first reached out: the questions they asked, the problems they described, the alternatives they mentioned.

You now have the shape of a lookalike audience built on actual buying behavior.

5. Route Every Signal You Find Back Into Your PPC Campaigns

Every first-party signal you extract can flow back into Google Ads as: 

  • A primary conversion action tied to revenue.
  • A custom audience derived from actual buyer behavior.
  • A negative keyword list reflecting wrong-intent traffic.

 This is what CTM’s AskAI is built for. It runs natural-language queries against your full inbound lead volume to surface the intent patterns that predict close rates, without a human needing to review every call.

 How to Close the Loop Between Your Campaigns, Your CRM, and Your Sales Team

 The first-party signals in your inbound lead data are as useful to your sales team as they are to your campaigns.

 Give Sales & Marketing the Same Intelligence at the Same Time

 When sales knows what’s coming in from campaigns, and marketing knows what’s actually closing, you spend less time re-qualifying leads, lose fewer deals to slow follow-up, and cut the time between first contact and closed revenue.

Make Your CRM Smarter the Moment a Lead Calls In

Your CRM can contain lead-quality data from the moment someone picks up the phone.

What Changes When Your Campaigns Learn From Every Conversion & Conversation

Teams that connect inbound lead quality back to their campaign data start seeing CPL drop, not because they cut budget, but because Smart Bidding stops wasting it on the wrong people. Close rates climb because the leads coming in more closely match the customers who actually buy. And for the first time, reporting reflects what’s happening in the business, not just what’s happening in the platform.

And the impact doesn’t stop at PPC. The same first-party data closes attribution gaps for organic search, adds specificity to sales enablement, and makes reporting defensible in budget meetings.

CTM captures every inbound conversation across paid media, SEO, sales, and offline channels.

AskAI, its AI-powered conversation analysis, turns each one into a structured signal you can route into Google Ads, your CRM, and the rest of your stack.

See how CTM connects your campaigns to the conversations they drive →

[Request a demo]

Sponsored
In partnership with Rundown


Google Ads keeps taking control away from you. This stack shows you what’s new in Performance Max and AI Max, what to measure when AI runs the auction, and how to feed the algorithm signals that drive real revenue.

You’ll Learn How To:

  • Stop Performance Max from spending on the wrong audiences
  • Prepare for AI Max before Display Ads disappear in September
  • Measure PPC performance when the auction is a black box
  • Rank your conversions so Smart Bidding chases the right ones

By clicking the “Submit” button, I agree to the terms of the Alpha Brand Media content agreement and privacy policy.

Search Engine Journal uses the information you provide to contact you about our relevant content and promotions. Search Engine Journal will share the information you provide with the following sponsors, who will use your information for similar purposes: CallTrackingMetrics. You can unsubscribe from communications from Search Engine Journal at any time.
