AI Technology

AI Hallucination

When AI generates confident but factually incorrect information.

Definition

AI hallucination refers to instances where an artificial intelligence model generates information that sounds plausible but is factually incorrect, such as fabricated statistics, non-existent citations, or invented facts. In enterprise contexts hallucination is a significant risk, because confidently stated misinformation can drive poor decisions. Retrieval-augmented generation (RAG) systems with proper citations help mitigate hallucination by grounding responses in actual source documents, so each claim can be traced back to the passage that supports it and verified.
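To make the grounding idea concrete, here is a minimal sketch of the pattern: retrieve relevant passages, then instruct the model to answer only from those passages and to cite their document IDs so every claim can be checked. The toy corpus, the keyword-based retrieve function, and the prompt wording are illustrative assumptions, not any specific product's implementation.

```python
# Minimal sketch of RAG-style grounding against hallucination:
# the model is told to answer ONLY from retrieved passages and to cite
# document IDs, so unsupported claims can be flagged by checking citations.
# Corpus, retriever, and prompt are illustrative, not a real product's API.

from dataclasses import dataclass


@dataclass
class Passage:
    doc_id: str
    text: str


# Toy document store standing in for an enterprise corpus.
CORPUS = [
    Passage("policy-001", "Travel expenses over $500 require VP approval."),
    Passage("policy-002", "Remote work requests are reviewed quarterly."),
]


def retrieve(query: str, corpus: list[Passage], k: int = 2) -> list[Passage]:
    """Naive keyword-overlap retriever; real systems use vector search."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda p: len(terms & set(p.text.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_grounded_prompt(query: str, passages: list[Passage]) -> str:
    """Instruct the model to answer only from sources and cite doc IDs."""
    context = "\n".join(f"[{p.doc_id}] {p.text}" for p in passages)
    return (
        "Answer using ONLY the sources below. Cite the doc ID for every "
        "claim. If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}\nAnswer:"
    )


if __name__ == "__main__":
    q = "Who must approve travel expenses over $500?"
    prompt = build_grounded_prompt(q, retrieve(q, CORPUS))
    print(prompt)  # send this prompt to any LLM; citations make the answer auditable
```

Because every sentence in the answer must carry a doc ID, responses that cite nothing, or cite a passage that does not actually support the claim, can be caught automatically or by a human reviewer rather than trusted on tone alone.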

See AI in action

Understanding the terminology is the first step. See how Conductor applies these concepts to solve real document intelligence challenges.

Request a demo