AI & Automation
drupal planet
seo

AI Search and Your Website: How to Stay Visible in the Age of Overviews

Duncan Worrell

Technical Client Services Manager


Introduction

The way people discover information online is evolving at pace. Not long ago, success meant second-guessing which keywords to type into a search box and hoping a site with the information you wanted appeared at the top of a neat list of blue links. Businesses spent years fine-tuning meta tags and tinkering with HTML in an effort to please opaque search algorithms. Looking back, it almost feels quaint.

Today, users are far more likely to type full questions into Google or ask AI assistants such as ChatGPT, Perplexity, or Gemini to give them instant, conversational answers. Increasingly, the response they receive is drawn directly from web content — but without necessarily sending visitors back to the original site.

For years, there was an unspoken deal between organisations and search engines: “we’ll let you crawl our content, and in return you’ll drive visitors to our site.” That exchange is weakening. AI assistants and search engines still benefit from your content, but the flow of traffic to your website is steadily shrinking.

For businesses, this shift raises three pressing questions:

  • How do you make sure your content is visible to AI systems?
  • How do you measure that visibility when clicks may not follow?
  • And how do you maximise the value of the visitors you do get?

 

Google AI Overviews

Google is now weaving artificial intelligence directly into its search results, producing AI Overviews that appear right at the top of the page. These are bite-sized, summarised answers to queries, often stitched together from multiple websites. For businesses, this shift is both a risk and an opportunity.

Just as you once wanted to rank at the top of the search results, the new goal is to become one of the trusted sources cited in AI Overviews. And while it’s true that some questions may be answered before a visitor ever clicks through, those who do choose to continue to your site are often more motivated — the higher-value visitors who are actively looking for deeper detail, services, or products.

The key is to prepare now: optimising for AI Overviews is about staying visible in a world where the number of clicks may shrink, but the quality of those clicks could rise.

A further evolution is Google’s AI Mode, which goes beyond Overviews to deliver full generative responses that combine multiple sources, support follow-ups, and surface richer content. This raises the bar: not only does your site need to be Overview-able, but it also needs to be seen as a reliable seed for deeper AI responses.

 


Measuring the impact of AI traffic

Right now, Google doesn’t separate traffic from AI Overviews in GA4 or Search Console; it all shows up as standard organic traffic. That means you can’t directly see how often your content has been pulled into an Overview. Still, there are practical ways to build visibility into AI-driven traffic today.

The starting point is GA4. By creating a dedicated traffic channel for AI (using regex to capture sessions from platforms like ChatGPT, Perplexity, Copilot, and Gemini; an example pattern follows the list below), you can start segmenting this audience. High-level dashboards in Looker Studio then allow you to track key signals such as:

  • Sessions and engagement: comparing AI-driven visits against traditional search.
  • Conversions: measuring whether these visitors arrive with clearer intent.
  • Trends: monitoring how this channel grows over time.
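
As a starting point, a source-matching condition along these lines in a GA4 custom channel group can capture AI-referred sessions. The domains are illustrative rather than exhaustive; check them against the referral sources you actually see in your reports:

    Session source matches regex:
    .*chatgpt\.com.*|.*openai\.com.*|.*perplexity\.ai.*|.*copilot\.microsoft\.com.*|.*gemini\.google\.com.*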

To go further, server log analysis can reveal when AI bots are consuming your pages — a useful signal that your content is being used to generate answers, even if it doesn’t always produce a click. Shifts in impressions versus clicks in Search Console can also hint at where Overviews are competing with your normal search visibility.
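
As a rough sketch of the idea, a short script can tally requests from known AI crawlers in a standard access log. The user-agent substrings and log path here are assumptions to verify against your own setup:

    import re
    from collections import Counter

    # User-agent substrings for well-known AI crawlers (check vendors' current docs)
    AI_BOTS = ["GPTBot", "PerplexityBot", "ClaudeBot", "Google-Extended", "CCBot"]

    counts = Counter()
    with open("access.log") as log:  # example path; point at your real access log
        for line in log:
            for bot in AI_BOTS:
                if bot in line:
                    # Pull the requested path out of a combined-format log line
                    match = re.search(r'"(?:GET|POST|HEAD) (\S+)', line)
                    if match:
                        counts[(bot, match.group(1))] += 1

    # The most-fetched pages per bot hint at which content feeds generative answers
    for (bot, path), hits in counts.most_common(10):
        print(f"{bot:16} {hits:5}  {path}")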

What we’re seeing already is clear: AI-driven visitors often view fewer pages, but they convert at higher rates because they arrive with clearer intent. At the same time, bot traffic gives you a window into which of your pages are directly shaping generative answers.

The message is simple: GEO (Generative Engine Optimisation) is the natural evolution of SEO. It’s no longer just about ranking in Google, but about being recognised as a trusted source for AI systems. The next steps for most organisations are straightforward: set up AI tracking in GA4, build executive dashboards, monitor AI bot activity, and review which content carries the most generative weight. Quarterly benchmarking then allows you to measure how AI is shaping acquisition and to adjust investments across SEO, GEO, and content accordingly.

Even without official Overview metrics, this framework gives you a way to uncover the hidden value your content is creating in the AI ecosystem — and turn uncertainty into competitive advantage.

 

Guiding AI to your website’s best content

A new idea is starting to gain traction: creating an llms.txt file that acts as a curated “highlight reel” of your most valuable content, presented in a format that large language models can easily consume. Think of it as the AI-era cousin of robots.txt.

Alongside it is llms-full.txt, an enhanced version that works much like a sitemap.xml, listing the content you would want Large Language Models (LLMs) to see, but with semantic guidance. Both files are written in Markdown, making them human- and machine-readable, and placed in your site’s root directory. Tools and Drupal modules are already emerging to make generating and maintaining these files straightforward.
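
As a purely illustrative sketch (the site name, URLs, and descriptions are placeholders), an llms.txt follows a simple Markdown convention: a top-level heading, a short summary, then sections of annotated links:

    # Example Agency

    > A digital agency specialising in Drupal development and AI-ready content.

    ## Key pages

    - [Services](https://example.com/services): What we offer and how we work
    - [AI & search insights](https://example.com/blog/ai-search): Guidance on GEO and staying visible to AI systems

    ## Optional

    - [Case studies](https://example.com/case-studies): Longer-form project write-ups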

Why it matters:

  • Boost the chances of your site being cited in AI-generated answers.
  • Highlight the pages you want AI systems to treat as priority.
  • Future-proof your digital presence for a search landscape that’s expanding beyond traditional engines.

It’s still early days, and there’s no official confirmation from the major AI providers on whether they’ll adopt this standard. That said, there are signs that at least some systems are already looking for and using these files.

Using llms.txt or llms-full.txt (or the similar ai.txt) is currently a signal or best-practice experiment. It’s speculative rather than a guaranteed control mechanism.

To maximise the chance of compliance, the best “recognised control” today remains robots.txt, which is widely respected by major crawlers, including OpenAI’s.
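
For example, a robots.txt can admit or refuse individual AI crawlers by user-agent token. The tokens below are the published ones as far as we know, but they do change, so verify against each vendor’s current documentation:

    # Let OpenAI's crawler read public content but keep it out of private areas
    User-agent: GPTBot
    Disallow: /private/

    # Opt out of Google's AI training uses without affecting normal Search crawling
    User-agent: Google-Extended
    Disallow: /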

Complement any use of llms.txt with server log monitoring, crawl tracking, and explicit APIs (or structured data) to guide AI systems.

 

From SEO to GEO: The foundations still matter

Making your site AI-friendly isn’t radically new. Most of the core principles of good SEO still apply. Clear, authoritative, well-structured content is what AI systems surface, just as search engines always have. What’s changing is the format and visibility: AI assistants rely more heavily on structured signals, concise answers, and attribution metadata. If your site is already well-optimised for humans and search engines, you’re halfway there; the extra step is making that content machine-readable and compliant so it can be safely included in AI summaries.

Key focus areas:

  • Keep content structured, semantic, and accessible (headings, schema.org markup, alt text); a markup sketch follows this list.
  • Maintain high-quality, authoritative, up-to-date content with clear attribution.
  • Use feeds, APIs, or files like llms.txt to guide AI crawlers directly.
  • Ensure compliance and transparency (GDPR-friendly banners, consent logging).
  • Optimise performance, linking, and internal hierarchy so context isn’t lost.
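
As a minimal sketch of the schema.org markup mentioned above (every value here is a placeholder), JSON-LD for an article page looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "AI Search and Your Website",
      "author": { "@type": "Person", "name": "Duncan Worrell" },
      "datePublished": "2025-01-01",
      "publisher": { "@type": "Organization", "name": "Example Agency" }
    }
    </script>

Explicit fields like author and datePublished are exactly the kind of attribution metadata AI systems lean on when deciding what to cite.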

Just like SEO, tracking how AI traffic is affecting your website is the first step towards maximising GEO value.

Some have speculated about a future where websites exist only as Markdown feeds for machines. We don’t believe that’s realistic — design, usability, and human experience still matter enormously — but it’s a reminder of how fast the digital ecosystem is shifting, and why it’s worth preparing now. Acting early brings concrete benefits:

  • Data to refine your strategy. Measuring AI presence and Google AI Overviews performance reveals what’s working and where to focus efforts.
  • Early adopters gain prominence. Sites that adapt early are more likely to secure a place in AI-generated answers before competitors.
  • Greater control over digital presence. You decide which content takes priority rather than relying solely on algorithms.
  • Competitive advantage. Adapting early could be the difference between becoming a primary source or fading into obscurity.

 

Summary

The search landscape is shifting fast, with AI systems like Google’s Overviews and tools such as ChatGPT, Perplexity, and Gemini changing how people find and consume information. While this creates uncertainty, it doesn’t mean starting from scratch. The foundations of good SEO — clear, structured, authoritative content — remain vital. What’s new is the need to make that content machine-readable, track AI-driven traffic, and use emerging tools like llms.txt to signal priority pages. Early adopters who embrace Generative Engine Optimisation (GEO) will secure visibility, attract higher-value visitors, and maintain a competitive edge in an AI-first search world.

How can we help?

If you want to chat about mastering AI in our fast-changing world, get in touch.