The future of SEO is LLMO, says Mike King

Google's Gemini-powered answers now sit at the top of Google's search results, and they will sometimes echo the exact phrasing of the first organic result. AI Mode marks a fundamental shift in how search engines operate: it is not an iteration of the old system; it aims to replace it. The model moves from indexing pages and ranking results to synthesising answers using generative AI, persistent user context, and semantic relevance. For SEO professionals and link builders, this isn't an update. It's a new game.

Mike King’s recent breakdown of Google’s AI Mode (read it!) isn’t just a technical post; it’s a warning shot. For years, SEO professionals have been optimising for an ecosystem of pages, links, and keywords. But AI Mode, as King explains, wipes that slate clean. This isn’t Search 2.0. It’s a new machine entirely, built on generative AI, persistent user context, and semantic synthesis.

According to King, if you’re still chasing rankings the old way, you’re already behind. Here’s why.

“It’s not search anymore—it’s synthesis”

King makes one thing very clear: AI Mode doesn’t operate like a traditional search engine. It doesn’t return documents. It constructs responses.

Using large language models (LLMs), Google interprets your query, factors in your search history and behaviour, and then builds an answer from passage-level data pulled from multiple sources. It’s less about finding the best match and more about creating the best response.

As King writes, “AI Mode no longer returns a ranked list of results. It provides a generated answer built from real-time retrieval of content.” That means your content needs to be findable at the passage level, not just the page level.
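Neither King nor Google publishes the actual pipeline, but a toy sketch helps make the retrieve-then-synthesise idea concrete. In the Python below, every function name is invented for illustration: the scoring function is a naive word-overlap stand-in for semantic retrieval, and the synthesis step simply stitches the top passages together where a real system would prompt an LLM.

```python
# Toy retrieve-then-synthesise pipeline (illustration only, not Google's system).
# Real AI Mode uses learned embeddings and an LLM; both are trivial stand-ins here.

def score(query: str, passage: str) -> float:
    """Placeholder relevance score: word overlap. A real system compares embeddings."""
    q_words = set(query.lower().split())
    p_words = set(passage.lower().split())
    return len(q_words & p_words) / max(len(q_words), 1)

def retrieve(query: str, passages: list[str], k: int = 3) -> list[str]:
    """Return the k passages most relevant to the query, not the k best pages."""
    return sorted(passages, key=lambda p: score(query, p), reverse=True)[:k]

def synthesise(query: str, passages: list[str]) -> str:
    """Stand-in for the generative step: a real system would hand these passages to an LLM."""
    cited = "\n".join(f"- {p}" for p in passages)
    return f"Answer to: {query}\nBuilt from the following retrieved passages:\n{cited}"

if __name__ == "__main__":
    corpus = [
        "Passage-level retrieval selects individual paragraphs, not whole pages.",
        "Our agency was founded in 2009 and loves coffee.",
        "AI Mode generates a single answer from several retrieved passages.",
    ]
    question = "how does AI Mode build answers?"
    print(synthesise(question, retrieve(question, corpus)))
```

The detail that matters is the unit of work: the pipeline never returns a ranked list of documents, it returns one generated answer assembled from whichever passages scored highest.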

Persistent context means one-size-fits-all content won’t cut it

One of King’s more subtle but important points is that AI Mode carries context forward. It doesn’t forget what a user just asked. It builds a model of their intent and adjusts future answers accordingly.

That breaks the fundamental logic of SEO as we know it. There’s no universal SERP anymore. Every user could get a slightly different, dynamically generated answer—even for the same query.

Your content won’t just compete against others—it will be filtered through the lens of each user’s unique session. King emphasises that this makes classic optimisation techniques brittle. “We’re no longer optimising for a stable SERP,” he writes. “We’re optimising for an evolving conversation.”

Pages don’t rank—passages do

In what may be the most impactful technical shift, King explains how AI Mode retrieves and ranks information at the passage level. Your beautifully structured long-form blog post? Irrelevant if the key insight is buried five paragraphs deep.

Google parses your content into chunks and only retrieves what fits semantically. As King notes, “It’s about extractability. Can your paragraph stand alone and serve as a direct contribution to an answer?”

This forces a complete rethink of content architecture. Every section needs to be independently valuable. Forget fluff intros, keyword padding, or filler content. They won’t be indexed, much less surfaced.
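To show what “extractability” might mean mechanically, here is a rough Python heuristic that splits a page into paragraph-level passages and drops the ones unlikely to stand alone. The word-count threshold and the dangling-referent check are arbitrary assumptions for illustration, not anything Google has documented.

```python
# Rough heuristic for passage extractability (illustrative assumptions, not Google's rules).

def chunk_page(text: str, min_words: int = 40) -> list[str]:
    """Split a page into paragraph-level passages; drop fragments too thin to stand alone."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    passages = []
    for p in paragraphs:
        words = p.split()
        # Skip very short paragraphs and ones that open with a dangling referent,
        # since they rarely make sense outside their original page context.
        if len(words) < min_words:
            continue
        if words[0].lower() in {"this", "that", "these", "those", "it"}:
            continue
        passages.append(p)
    return passages
```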

Vector space kills keyword-first SEO

King dedicates a large portion of his article to the limitations of existing SEO tools. The core issue? Most are designed for lexical matching—string-based comparisons, keyword density, backlinks.

AI Mode doesn’t care about any of that.

Instead, it maps both queries and content into high-dimensional vector spaces. The relevance of a passage is determined by its proximity—mathematically—to the user’s intent, not its match to a keyword.

King is blunt here: “Our tools weren’t built for this world.” Without visibility into vector embeddings and semantic similarity, most SEOs are operating in the dark. No matter how well you’ve optimised by old standards, the new engine simply doesn’t see your work the same way.
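To see how vector-space relevance differs from keyword matching, the sketch below uses the open-source sentence-transformers library and its all-MiniLM-L6-v2 model as a stand-in for whatever embeddings Google actually uses. The point is the mechanism: query and passages are embedded into the same space, and cosine similarity, not shared keywords, decides which passage sits closest to the intent.

```python
# Semantic similarity in vector space, using sentence-transformers as a stand-in
# for whatever embedding model Google actually runs.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, general-purpose embedding model

query = "how do I make my content show up in AI answers?"
passages = [
    "Structure each paragraph so it fully answers one specific question on its own.",
    "content content content best content cheap content buy content now",  # keyword-stuffed
]

# Embed the query and the passages into the same vector space.
query_vec = model.encode(query, convert_to_tensor=True)
passage_vecs = model.encode(passages, convert_to_tensor=True)

# Cosine similarity measures proximity of meaning, not string overlap.
scores = util.cos_sim(query_vec, passage_vecs)[0]
for passage, s in zip(passages, scores):
    print(f"{float(s):.3f}  {passage}")
# In practice the self-contained advice tends to score higher than the keyword-stuffed
# string, even though the latter repeats the word "content" over and over.
```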

Relevance engineering is the new SEO

King’s proposed response to all this upheaval? Relevance engineering.

In short: your job is no longer to optimise pages—it’s to engineer meaning. To create content that LLMs can understand, repurpose, and recombine in service of an answer. This includes:

  • Writing passage-level content that answers specific intents
  • Ensuring semantic clarity, not keyword frequency
  • Structuring content so it can be lifted and re-used
  • Thinking in terms of use cases, not just traffic potential

Even link building, King notes, will need to adapt. “A link in a semantically aligned paragraph may carry more relevance than one from a high-authority homepage.” It’s not about raw authority anymore. It’s about thematic fit.

Prepare or disappear

Mike King’s final message is unambiguous: adapt or vanish. He doesn’t say SEO is dead. He says traditional SEO is now a relic. The skills that made you successful five years ago won’t serve you in a search landscape where the engine decides what an answer looks like—and builds it in real time.

To keep up, SEOs must become semantic strategists. Content creators must write for AI readers as much as for humans. And the entire ecosystem must shift from keyword targeting to knowledge delivery.

The game hasn’t just changed. It has ended. A new one has started, and Mike King just showed us the rules.

Frequently Asked Questions

1. What is AI Mode in Google Search?
AI Mode is Google’s shift from traditional search results to AI-generated answers. Instead of returning a ranked list of blue links, Google now uses large language models (LLMs) to understand your intent, retrieve content fragments (passages) from various sources, and generate a coherent, conversational response. This changes the user experience from search-and-click to direct answer consumption—and it reduces the visibility of standard organic listings.

2. What does Mike King mean by “AI Mode replaces traditional search”?
Mike King argues that AI Mode fundamentally breaks the old search model. Where SEO used to be about optimising whole pages for specific keywords to appear in ranked results, AI Mode pulls out only relevant content fragments and assembles a unique answer for each user. You’re no longer competing to be position 1—you’re competing to be part of the machine’s response.

3. What is LLMO?
LLMO stands for Large Language Model Optimisation. It’s a new approach to content strategy focused on how LLMs like Google’s Gemini (formerly Bard) interpret and use information. Instead of just matching queries to keywords, LLMO involves structuring content so it’s readable, extractable, and semantically rich—making it more likely to be selected and synthesised into AI-generated answers.

4. What is GEO (Generative Engine Optimisation)?
GEO is an emerging term (and play on SEO) that reflects how marketers now need to optimise content for generative engines instead of search engines. While SEO aimed to boost rankings in the SERP, GEO aims to increase the chance that your content is included in an AI response. It focuses on content clarity, contextual value, and being semantically relevant enough to be lifted as a passage.

5. How does passage-level indexing affect my SEO strategy?
Instead of evaluating and ranking entire pages, AI Mode indexes and ranks at the passage level. That means Google might ignore most of your article and only use one paragraph—if it’s relevant. This forces a rethink of how you structure content: every section should be self-contained, topically focused, and able to provide standalone value. Fluff and long intros don’t get indexed. Precision and clarity do.

6. Can traditional SEO tools still help in an AI Mode world?
To a limited extent. Most current tools track rankings, keyword density, and backlinks—but these aren’t the main signals AI Mode uses. Since AI Mode relies on vector embeddings (mathematical representations of meaning), most tools can’t “see” what the AI sees. They won’t tell you how semantically relevant your content is to a query. New tools or models that can analyse content in vector space will become essential.

7. What is “relevance engineering” and why does it matter?
Relevance engineering, as coined by Mike King, is the process of creating content that’s structured and written to be useful to a generative AI system. This means anticipating how your text will be interpreted, broken down into passages, and potentially recombined into a response. It’s about clarity, modularity, and semantic depth—less about keyword targeting, more about being part of a helpful, synthesised answer.

8. Is link building still important with AI Mode?
Yes, but its value has changed. Backlinks may still play a role in indicating trust and authority, but what matters more now is the context of the link. A link embedded within a semantically relevant passage—on a topic closely related to the query—will likely have more influence than a sidebar or homepage link. Relevance is now more valuable than raw domain authority.

9. Does Google still use traditional ranking signals in AI Mode?
Possibly, but they’re no longer central. While backlinks, site structure, and keyword placement may still inform passage selection behind the scenes, AI Mode prioritises semantic understanding and contextual fit. The engine isn’t ranking documents; it’s assembling answers. So those traditional signals are background data—not the main criteria for visibility.

10. How can I future-proof my SEO strategy for AI Mode?
Shift your mindset from ranking pages to feeding answers. Write content that addresses specific questions clearly and concisely. Break long articles into logical, modular sections. Use structured subheadings. Avoid fluff or vague generalities. And start exploring tools that help you understand how your content performs in semantic space, not just SERPs. The future of SEO is about being useful to machines—so they can be useful to users.
