
SEO Pulse: Google’s AI Mode Gets Personal, AI Bots Blocked, Domains Matter in Search

Google connects Gmail and Photos to AI Mode. Hostinger data shows AI training bots blocked while search bots expand. Mueller warns about free subdomain hosting.

Welcome to the week’s SEO Pulse. This week’s updates affect how AI Mode personalizes answers, which AI bots can access your site, and why your domain choice still matters for search visibility.

Here’s what matters for you and your work.

Google Connects Gmail And Photos To AI Mode

Google is rolling out Personal Intelligence, a feature that connects Gmail and Google Photos to AI Mode in Search, delivering personalized responses based on users’ own data.

Key facts: The feature is available to Google AI Pro and AI Ultra subscribers who opt in. It launches as a Labs experiment for eligible users in the U.S. Google says it doesn’t train on users’ Gmail inbox or Photos library.

Why This Matters

This is the personal context feature Google promised at I/O but delayed until now. We covered the delay in December when Nick Fox, Google’s SVP of Knowledge and Information, said the feature was “still to come” with no public timeline.

For AI Mode’s 75 million daily active users, the figure Fox reported, this could reduce how much context users need to type to get tailored responses. Google’s examples include trip recommendations that factor in hotel bookings from Gmail and past travel photos, or coat suggestions that account for preferred brands and the weather at an upcoming destination.

The SEO effects depend on how this changes query patterns. If users rely on Google pulling context from their email and photos instead of typing it, queries may get shorter and more ambiguous. That makes it harder to target long-tail searches with explicit intent signals.

What People Are Saying

Early social reaction frames this as Google pushing AI Mode from “ask and answer” to “already knows your context.” Robby Stein, VP of Product at Google Search, positioned it as a more personal search experience driven by opt-in data connections.

On LinkedIn, the discussion quickly moved to trust and privacy tradeoffs. Michele Curtis, a content marketing specialist, framed personalization as something that only works when trust comes first.

Curtis wrote:

“Personalization only works when trust is architected before intelligence.”

Syed Shabih Haider, founder of Fluxxy AI, raised security concerns about connecting multiple apps.

Haider wrote:

“Personal Intelligence.. yeah the features/benefits look amazing.. but cant help but wonder about the data security. Once all apps are connected, the risk for breach becomes extremely high..”

Read our full coverage: Google Launches Personal Intelligence In AI Mode

AI Training Bots Lose Access While Search Bots Expand

Hostinger analyzed 66 billion bot requests across more than 5 million websites and found AI crawlers are following two different paths. Training bots are losing access as more sites block them. Search and assistant bots are expanding their reach.

Key facts: GPTBot and OAI-SearchBot both averaged 55.67% coverage in Hostinger’s data, but their trajectories differ. GPTBot, which collects training data, fell from 84% to 12% over the measurement period. OAI-SearchBot, which powers ChatGPT search, reached the same average without a comparable decline. Googlebot maintained 72% coverage, and Apple’s bot reached 24.33%.

Why This Matters

The data confirms what we’ve tracked through multiple studies over the past year. BuzzStream found 79% of top news publishers block at least one training bot. Cloudflare’s Year in Review showed GPTBot, ClaudeBot, and CCBot had the highest number of full disallow directives. The Hostinger data puts numbers on the access gap between training and search crawlers.

The distinction matters because these bots serve different purposes. Training bots collect data to build models, while search bots retrieve content in real time when users ask questions. Blocking training bots opts you out of future model updates, and blocking search bots means you won’t appear when AI tools try to cite sources.

As a best practice, check your server logs to see what’s hitting your site, then make blocking decisions based on your goals.
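For a quick first pass, something like the short Python sketch below can tally AI crawler hits in a standard combined-format access log. The file path and bot list are illustrative assumptions, and it assumes a non-empty log; adjust both to the crawlers and servers you actually run.

from collections import Counter

# Substrings that identify common AI crawlers in the User-Agent field.
# Illustrative, not exhaustive; check each vendor's documentation.
AI_BOTS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User", "ClaudeBot",
           "CCBot", "Google-Extended", "PerplexityBot"]

counts = Counter()
total = 0

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        total += 1
        for bot in AI_BOTS:
            if bot in line:
                counts[bot] += 1
                break  # count each request once

for bot, hits in counts.most_common():
    print(f"{bot}: {hits} requests ({hits / total:.1%} of logged traffic)")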

What People Are Saying

On the practical SEO side, the most consistent advice is to separate “training” from “search and retrieval” in your robots.txt decisions where you can. Aleyda Solís previously summarized the idea: block GPTBot while still allowing OAI-SearchBot, so your content can surface in ChatGPT-style search experiences without being used for model training.

Solís wrote:

“disallow the ‘GPTbot’ user-agent but allow ‘OAI-SearchBot’”
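In robots.txt terms, that advice looks something like the following sketch, which blocks OpenAI’s training crawler while leaving its search crawler free to fetch everything. Treat it as a starting point, not a definitive policy:

# Block OpenAI's training-data crawler
User-agent: GPTBot
Disallow: /

# Allow OpenAI's search/retrieval crawler
User-agent: OAI-SearchBot
Allow: /

Crawlers follow the most specific user-agent group that names them, so GPTBot obeys its own block here while OAI-SearchBot, and any bot not listed, is unaffected by it.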

At the same time, developers and site operators keep emphasizing the cost side of bot traffic. In one r/webdev discussion, a commenter said AI bots made up 95% of requests before blocking and rate limiting.

A commenter in r/webdev wrote:

“95% of the requests to one of our websites was AI bots before I started blocking and rate limiting them”

Read our full coverage: OpenAI Search Crawler Passes 55% Coverage In Hostinger Study

Mueller: Free Subdomain Hosting Makes SEO Harder

Google’s John Mueller warned that free subdomain hosting services create SEO challenges even when publishers do everything else right. The advice came in response to a Reddit post from a publisher whose site shows up in Google but doesn’t appear in normal search results.

Key facts: The publisher uses Digitalplat Domains, a free subdomain service on the Public Suffix List. Mueller explained that free subdomain services attract spam and low-effort content, making it harder for search engines to assess individual site quality. He recommended building direct traffic through promotion and community engagement rather than expecting search visibility first.

Why This Matters

Mueller’s guidance fits a pattern we’ve covered over the years. Google’s Gary Illyes previously warned against cheap TLDs for the same reason. When a domain extension becomes overrun by spam, search engines may struggle to identify legitimate sites among the noise.

Free subdomain hosting creates a specific version of this problem. While the Public Suffix List is meant to treat these subdomains as separate registrable units, the neighborhood signal can still matter. If most subdomains on a host contain spam, Google’s systems have to work harder to find yours.
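You can see the Public Suffix List’s effect with the tldextract Python library, which ships with a copy of the list. A minimal sketch; github.io stands in for any PSL-listed free-subdomain host, since the specific Digitalplat entries aren’t confirmed in the source:

import tldextract  # pip install tldextract

# Include the PSL's "private" section, where free-subdomain hosts are listed.
extract = tldextract.TLDExtract(include_psl_private_domains=True)

# On a PSL-listed host, each subdomain is its own registrable unit:
# here the suffix is "github.io" and "myblog" counts as the domain.
print(extract("myblog.github.io"))

# On an ordinary domain, "blog" is just a subdomain of example.com.
print(extract("blog.example.com"))

Tools that respect the PSL treat myblog.github.io and otherblog.github.io as unrelated sites, which is exactly the separation Mueller says Google’s systems still have to work at when the surrounding subdomains are spammy.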

This affects anyone considering free hosting as a way to test an idea before buying a real domain. The test environment itself becomes part of the evaluation. As Mueller wrote, “Being visible in popular search results is not the first step to becoming a useful & popular web presence.”

For anyone advising clients or building new projects, the domain investment is part of the SEO foundation. Starting on a free subdomain may save money upfront, but it adds friction to visibility that a proper domain avoids.

What SEO Professionals Are Saying

Most of the social sharing here is treating Mueller’s “neighborhood” analogy as the headline takeaway. In the original Reddit exchange, he said publishing on free subdomain hosts can mean opening up shop among “problematic flatmates,” which makes it harder for search systems to understand your site’s value in context.

Mueller wrote:

“opening up shop on a site that’s filled with … potentially problematic ‘flatmates’.”

On LinkedIn, the story is being recirculated as a broader reminder that “cheap or free” hosting decisions can quietly cap performance even when everything else looks right. Fernando Paez V, a digital marketing specialist, called it out as a visibility issue tied to spam-heavy environments.

Paez V wrote:

“free subdomain hosting services … attract spam and make it more difficult for legitimate sites to gain visibility”

Read our full coverage: Google’s Mueller: Free Subdomain Hosting Makes SEO Harder

Theme Of The Week: Access Is The New Advantage

This week’s stories share a common element: access. Whether it’s access to personal data, bot access to your site, or access to a fair evaluation through your domain choice, it shapes outcomes before any optimization happens.

Personal Intelligence gives AI Mode access to your email and photos, changing what kinds of queries even need to happen. The Hostinger data shows search bots gaining access while training bots get locked out. Mueller’s subdomain warning reminds us that domain choice determines whether Google’s systems give your content a fair evaluation at all.

The common thread is that visibility increasingly depends on what you allow in and where you build. Blocking the wrong bots can reduce your chances of being surfaced or cited in AI tools. Building on a spam-heavy domain puts you at a disadvantage before you write a word. And Google’s AI features now have access to personal context that publishers can’t access or observe.

For practitioners, this means access decisions, both yours and the platforms’, shape results more than incremental optimization gains. Review your crawler permissions and domain choices, and watch how personal context in AI Mode changes the queries you’re trying to rank for.

Featured Image: Accogliente Design/Shutterstock
