All Glossary Terms
Tokenisation
Tokenisation is how LLMs break text into tokens, the small units they can process. Token limits affect context windows, cost, and how much content a model can process.
Technical Foundations
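As a minimal illustration of counting tokens, here is a short Python sketch using the open-source tiktoken library (an assumption for illustration; the glossary entry does not name any specific tokeniser or tooling):

import tiktoken  # OpenAI's open-source tokeniser library (assumed here for illustration)

# cl100k_base is the encoding used by several OpenAI models
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokenisation splits text into small units."
tokens = enc.encode(text)   # list of integer token ids
print(len(tokens))          # the token count is what consumes context-window space and drives cost
print(enc.decode(tokens))   # decoding the ids round-trips back to the original text

Note that token counts differ between tokenisers, so the same text may fit within one model's context window but exceed another's.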