Token
The basic unit of text that AI models process, roughly equivalent to a word or word piece.
Definition
A token is the basic unit of text that language models process. Tokens are typically words, parts of words, or punctuation marks. Most English words are single tokens, but longer or unusual words may be split into multiple tokens. Understanding tokenisation is important for estimating costs (often charged per token), managing context window limits, and optimising prompts. A rough rule of thumb is that 1 token equals approximately 0.75 words.
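The 0.75-words-per-token rule of thumb above can be turned into a quick back-of-envelope estimator. This is a minimal sketch, not a real tokeniser (models use subword schemes such as BPE, and actual counts vary); the per-token price used here is an illustrative assumption, not real pricing.

```python
# Rough token and cost estimator based on the ~0.75 words-per-token
# rule of thumb. Real tokenisers split text differently, so treat
# these numbers as estimates only.

def estimate_tokens(text: str, words_per_token: float = 0.75) -> int:
    """Estimate token count from word count using the rule of thumb."""
    word_count = len(text.split())
    return round(word_count / words_per_token)

def estimate_cost(text: str, price_per_1k_tokens: float) -> float:
    """Estimate processing cost, given an assumed price per 1,000 tokens."""
    return estimate_tokens(text) / 1000 * price_per_1k_tokens

prompt = "Summarise the attached contract in three bullet points."
tokens = estimate_tokens(prompt)   # 8 words -> about 11 tokens
cost = estimate_cost(prompt, price_per_1k_tokens=0.01)  # illustrative price
```

For accurate counts against a specific model, use that model provider's own tokeniser rather than a word-based estimate.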
More in AI Technology
Agentic AI
AI systems that can autonomously plan, reason, and execute multi-step tasks.
Context Window
The maximum amount of text an AI model can process in a single request.
Fine-tuning
Adapting a pre-trained AI model to perform better on specific tasks or domains.
Large Language Model (LLM)
AI models trained on vast text data to understand and generate human language.
See Token in action
Understanding the terminology is the first step. See how Conductor applies these concepts to solve real document intelligence challenges.
Request a demo