Context Window
The maximum amount of text an AI model can process in a single request.
Definition
A context window is the maximum number of tokens (roughly words or word pieces) that a language model can process in a single request, counting both the input prompt and the generated response. Larger context windows let a model work over longer documents and retain more conversation history, but they come with increased computational cost. Retrieval-augmented generation (RAG) systems help work around context limits by retrieving only the passages relevant to a query.
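As a rough illustration, the sketch below uses the tiktoken tokenizer library to check whether a prompt, plus a reserved budget for the model's response, fits within a context window. The window size and reserved budget here are placeholder values chosen for the example; real limits vary by model.

```python
import tiktoken

# Placeholder values for illustration; actual limits depend on the model.
CONTEXT_WINDOW = 8192          # total token budget for prompt + response
RESERVED_FOR_RESPONSE = 1024   # tokens set aside for the generated answer

# A commonly used OpenAI tokenizer; other models use different tokenizers.
enc = tiktoken.get_encoding("cl100k_base")

def fits_in_context(prompt: str) -> bool:
    """Return True if the prompt leaves room for the reserved response tokens."""
    prompt_tokens = len(enc.encode(prompt))
    return prompt_tokens + RESERVED_FOR_RESPONSE <= CONTEXT_WINDOW

print(fits_in_context("Summarize the attached contract in three bullet points."))
```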
Related terms
Large Language Model (LLM)
AI models trained on vast text data to understand and generate human language.
Token
The basic unit of text that AI models process, roughly equivalent to a word or word piece.
Retrieval Augmented Generation (RAG)
A technique that grounds AI responses in retrieved documents for accurate, cited answers.
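To make the RAG workaround mentioned above concrete, here is a minimal sketch: it ranks a few hypothetical document passages by naive keyword overlap and builds a prompt from only the top matches, so the request stays within the context window. Production RAG systems typically use embedding-based similarity search rather than keyword counts.

```python
from collections import Counter

# Hypothetical passages for illustration; a real system would query a document index.
passages = [
    "The lease term begins on January 1 and runs for 24 months.",
    "The tenant is responsible for utilities and routine maintenance.",
    "Either party may terminate with 60 days' written notice.",
]

def score(query: str, passage: str) -> int:
    """Naive keyword-overlap score; real systems use embedding similarity."""
    query_words = set(query.lower().split())
    passage_words = Counter(passage.lower().split())
    return sum(passage_words[w] for w in query_words)

def build_prompt(query: str, top_k: int = 2) -> str:
    """Include only the most relevant passages so the prompt fits the context window."""
    ranked = sorted(passages, key=lambda p: score(query, p), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("When can the lease be terminated?"))
```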