AI Technology

Transformer

The neural network architecture that powers modern language models.

Definition

A transformer is a neural network architecture introduced in the 2017 paper "Attention Is All You Need" that has become the foundation of modern language models. Instead of processing text one word at a time, transformers use an attention mechanism to process an entire sequence in parallel, weighing the relationships between all tokens regardless of how far apart they are in the text. This architecture powers models such as GPT, Claude, and BERT, enabling state-of-the-art performance in language understanding and generation.
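To make the attention idea concrete, here is a minimal sketch of scaled dot-product attention, the core operation inside a transformer. It is a toy illustration in plain Python, not production model code: each output vector is a weighted average of the value vectors, with weights determined by how similar each query is to each key. The function names and the tiny example sequence are illustrative choices, not part of any specific library.

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability, then normalize to sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over a toy sequence.

    For each query, compute similarity scores against every key,
    turn the scores into weights with softmax, and return the
    weighted average of the value vectors.
    """
    d_k = len(keys[0])  # key dimension, used to scale the scores
    outputs = []
    for q in queries:
        scores = [
            sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
            for k in keys
        ]
        weights = softmax(scores)
        out = [
            sum(w * v[j] for w, v in zip(weights, values))
            for j in range(len(values[0]))
        ]
        outputs.append(out)
    return outputs

# A three-token sequence with 2-dimensional embeddings.
# Self-attention uses the same sequence as queries, keys, and values.
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = attention(x, x, x)
```

Because every query attends to every key in one pass, the whole sequence is processed at once, which is what lets transformers capture long-range relationships that sequential models struggle with.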

See Transformer in action

Understanding the terminology is the first step. See how Conductor applies these concepts to solve real document intelligence challenges.

Request a demo