AEO Glossary
Key terms and concepts in Answer Engine Optimisation.
Core Concepts
AI Overview
Google AI Overviews are AI-generated answer summaries at the top of search results. Here's how they work, how prevalent they are, and how they affect organic traffic.
AI Search
AI search uses large language models to generate direct answers from web content instead of returning ranked link lists. Here's what it means for SEO.
Answer Engine Optimisation
Answer engine optimisation (AEO) is the practice of making your content discoverable and citable by AI search tools like ChatGPT, Perplexity, and Google AI Overviews.
Generative Engine Optimisation
Generative engine optimisation (GEO) is the practice of making content discoverable and citable by AI search tools like ChatGPT, Perplexity, and Google AI Overviews.
Large Language Model Optimisation
LLMO is the practice of optimising content for large language models. It's the same discipline as AEO and GEO — here's what it means and how the terms relate.
Zero-Click Search
A zero-click search resolves entirely on the results page without the user clicking through. Here's what it means for SEO in the AI era.
How AI Search Works
AI Agent
AI agents are autonomous systems that use reasoning to navigate websites and extract information. Learn how agent-based search differs from traditional search.
AI Citation
AI citation (also called inline citation) is how AI search tools attribute information to sources in generated answers. Learn why citation matters for content visibility in AI search.
AI Crawling
AI crawling is how AI search agents navigate websites to find and extract information. Learn how it differs from traditional crawling and what affects AI discoverability.
Grounding
Grounding is how AI systems anchor answers in real sources to prevent hallucination and ensure accuracy. Learn how to make content groundable for AI search visibility.
Hallucination
AI hallucination occurs when language models generate false but plausible information. Learn why it happens, how grounding prevents it, and what it means for content visibility.
Large Language Model
LLMs are AI systems trained on vast text data to predict and generate language. Understand how they power AI search and what their constraints mean for content extraction.
Query Fan-Out
Query fan-out is how AI search tools break complex questions into targeted sub-queries. Learn why it matters for content visibility in AI search.
Retrieval-Augmented Generation
RAG combines LLM reasoning with content retrieval, allowing AI search tools to ground answers in real, current web content. Understand why it matters for content visibility.
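The retrieve-then-generate loop can be sketched in a few lines. This is a toy illustration, not a production pipeline: retrieval here uses bag-of-words cosine similarity instead of real embeddings, the document list is invented, and the final step returns the grounded context where a real system would call an LLM.

```python
# Minimal retrieval-augmented generation sketch.
# Toy retrieval (bag-of-words cosine); the "generation" step is a placeholder.
from collections import Counter
import math

DOCS = [
    "AEO makes content discoverable and citable by AI search tools.",
    "A vector database stores embeddings for semantic retrieval.",
    "robots.txt controls which crawlers may access a site.",
]

def bow(text):
    """Bag-of-words vector as a token-count dictionary."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    """Rank documents by similarity to the query; keep the top k."""
    q = bow(query)
    return sorted(DOCS, key=lambda d: cosine(q, bow(d)), reverse=True)[:k]

def answer(query):
    # A real system would pass the retrieved context to an LLM here.
    context = retrieve(query)
    return f"Answer grounded in: {context[0]}"

print(answer("how do AI tools store embeddings?"))
```

The key property RAG adds is that the generated answer is anchored to retrieved source text, which is what makes citation (and hence AEO) possible.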
Semantic Search
Semantic search uses AI to understand meaning instead of matching keywords. Learn how it powers AI search tools and why content clarity matters.
Platforms and Features
ChatGPT Search
ChatGPT Search is OpenAI's real-time web search integration into ChatGPT. Here's how it finds and cites your content, and what it means for visibility.
Claude
Claude is Anthropic's AI assistant with web search and tool use capabilities. Here's how it discovers and cites web content differently from other platforms.
Featured Snippet
A featured snippet is a Google SERP feature that displays direct answers above organic results. Here's how they're evolving as AI Overviews become more prominent.
Gemini
Gemini is the AI model family powering Google's AI search features like AI Overviews and AI Mode. Here's how Gemini works and what it means for content visibility.
Google AI Mode
Google AI Mode is a conversational AI search experience integrated into Google Search. Here's how it uses Google's ranking infrastructure and differs from standalone AI platforms.
People Also Ask
People Also Ask (PAA) is a Google SERP feature showing related questions. Here's what it is, how to use it for content strategy, and how it's evolving with AI.
Perplexity
Perplexity is an AI search engine that generates cited answers using real-time web search. Here's how it finds and cites your content.
Technical Accessibility
AI Crawl Budget
AI crawl budget is the limited set of resources an AI agent allocates to exploring your site. Here's how it differs from Google crawl budget and why it matters for discovery.
AI Navigability
AI navigability is the ability of AI agents to traverse your site, find target content, and extract information. Here's how it differs from UX and why it matters.
Content Extractability
Content extractability measures how easily AI agents can extract useful information from your pages. Here's what it is and why it matters for AI discovery.
E-E-A-T
E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is Google's quality framework. It now matters for AI search too. Here's what it means.
robots.txt
robots.txt controls crawler access, but does it block AI agents like ChatGPT and Claude? Here's what works, what doesn't, and why the rules are changing.
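A typical policy looks like the fragment below: allow search-oriented crawling while opting out of model-training crawlers. The user-agent tokens shown (GPTBot, Google-Extended, PerplexityBot) are real published tokens at the time of writing, but vendors add and rename agents, so verify the current list in each vendor's documentation before relying on it.

```
# Illustrative robots.txt: block training crawlers, allow search crawlers.
# Check each vendor's docs for current user-agent tokens.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Allow: /
```

Note that robots.txt is advisory: compliant crawlers honour it, but it is not an access control mechanism.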
Schema Markup
Schema markup is machine-readable code that helps AI agents and search engines understand your content. Here's what it is and why it matters for AI discovery.
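Schema markup is most commonly added as a JSON-LD block in the page head. The sketch below uses the real schema.org Article type; the headline, author, and date values are placeholders, not from any actual page.

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is Answer Engine Optimisation?",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2025-01-15"
}
```

Embedding this in a `<script type="application/ld+json">` tag gives crawlers and AI agents unambiguous facts about the page without parsing the visible HTML.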
Topical Authority
Topical authority is deep, interconnected expertise on a subject. Here's why it matters for Google ranking and AI agent citations.
Technical Foundations
Chunking
Chunking is splitting content into processable units for embedding and retrieval. Chunk quality directly affects whether AI systems can find and extract your information.
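A minimal chunker can be sketched in a few lines. This is a character-based toy with a fixed window and overlap; production systems more often chunk by sentences, headings, or token counts, but the overlap idea is the same: text cut at one boundary still appears whole in a neighbouring chunk.

```python
def chunk(text, size=200, overlap=40):
    """Split text into fixed-size character chunks with overlap, so a
    sentence cut at one boundary still appears whole in the next chunk."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

pieces = chunk("word " * 100, size=50, overlap=10)
print(len(pieces), "chunks")
```

Chunk boundaries that split a fact across two chunks can make that fact unretrievable, which is why heading-aware chunking usually beats raw character windows.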
Cosine Similarity
Cosine similarity measures how semantically similar two embeddings are. It's the core ranking mechanism for semantic search and AI-powered content discovery.
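The computation itself is simple: the dot product of two vectors divided by the product of their lengths. A minimal pure-Python version, assuming two equal-length embedding vectors:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors:
    1.0 means identical direction, 0.0 means orthogonal (unrelated)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

Because it compares direction rather than magnitude, cosine similarity is insensitive to document length, which is one reason it is the default metric for comparing embeddings.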
Named Entity Recognition
Named Entity Recognition (NER) identifies and classifies entities (people, companies, locations, products) in text. It's how AI systems extract structured facts and build knowledge graphs.
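The input/output shape of NER can be illustrated with a toy gazetteer lookup. Real systems use trained statistical models (spaCy's pipelines, for example) rather than a hand-written dictionary, and the entity list below is invented, but the contract is the same: text in, labelled spans out.

```python
# Toy gazetteer-based NER. Real NER uses trained models; this only
# illustrates the input/output shape: text in, labelled spans out.
ENTITIES = {
    "OpenAI": "ORG",
    "Anthropic": "ORG",
    "Paris": "LOC",
}

def tag_entities(text):
    """Return (entity, label, offset) tuples found in the text, in order."""
    found = []
    for name, label in ENTITIES.items():
        idx = text.find(name)
        if idx != -1:
            found.append((name, label, idx))
    return sorted(found, key=lambda t: t[2])

print(tag_entities("Anthropic and OpenAI both ship assistants."))
```

Content that names entities explicitly and consistently gives extraction systems exactly these unambiguous spans to anchor facts to.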
Semantic Similarity
Semantic similarity measures whether two texts express similar meaning, regardless of keyword overlap. It's how AI systems find and rank relevant content.
Sentence Transformers
Sentence Transformers (SBERT) are neural models that convert text to semantic embeddings. They're the foundation of semantic search, extraction quality, and AI content understanding.
Tokenisation
Tokenisation is how LLMs break text into tokens — small units they can process. Token limits affect context windows, cost, and content processability.
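A naive tokeniser makes the idea concrete. This sketch splits on words and punctuation; real LLM tokenisers use byte-pair encoding and produce subword units, so counts from this toy only approximate true token counts.

```python
import re

def toy_tokenise(text):
    """Naive tokeniser: words and punctuation become separate tokens.
    Real LLM tokenisers (byte-pair encoding) split into subword units,
    so this only approximates true token counts."""
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenise("Tokenisation affects cost, context windows, and limits.")
print(tokens)
print(len(tokens), "tokens")  # 10 tokens
```

Since models bill and limit context by tokens rather than characters or words, even rough token counts are useful when sizing content for processing.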
Vector Database
A vector database stores embeddings and retrieves semantically similar content. It's the infrastructure layer that powers AI search retrieval.
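The core operation is nearest-neighbour search over stored embeddings. The in-memory sketch below shows that contract with brute-force cosine similarity; real vector databases add persistence, approximate-nearest-neighbour indexes, and filtering, and the two-dimensional vectors here stand in for real embeddings with hundreds of dimensions.

```python
import math

class ToyVectorStore:
    """In-memory sketch of a vector database: store (embedding, payload)
    pairs and return the payload whose embedding is nearest the query.
    Real systems use ANN indexes instead of this brute-force scan."""

    def __init__(self):
        self.items = []  # list of (vector, payload) pairs

    def add(self, vector, payload):
        self.items.append((vector, payload))

    def nearest(self, query):
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(y * y for y in b))
            return dot / (na * nb) if na and nb else 0.0
        return max(self.items, key=lambda item: cos(query, item[0]))[1]

store = ToyVectorStore()
store.add([1.0, 0.0], "doc about crawling")
store.add([0.0, 1.0], "doc about embeddings")
print(store.nearest([0.1, 0.9]))  # "doc about embeddings"
```

This is the retrieval half of RAG: the query is embedded, the store returns the nearest documents, and those documents ground the generated answer.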
Vector Embedding
A vector embedding is a numerical representation that captures the semantic meaning of text. Embeddings are the foundation of semantic search, extraction, and AI discovery.