
What is GPT?

A plain-English explanation of GPT (Generative Pre-trained Transformer) — what it means, why it matters, and how it is used in AI.

GPT
Generative Pre-trained Transformer
GPT stands for Generative Pre-trained Transformer. It is a family of large language models developed by OpenAI that use the Transformer architecture to generate human-like text, one token at a time, by predicting what comes next. GPT models are pre-trained on vast amounts of internet text and can then be fine-tuned for specific tasks.
"GPT-4 is used in ChatGPT to answer questions, write code, summarise documents, and assist with creative writing."
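The "generative" part of the name describes a simple loop: the model repeatedly predicts the next token given the tokens so far. The sketch below illustrates that loop with a tiny hard-coded probability table; the vocabulary and probabilities are made up for illustration, whereas a real GPT learns next-token probabilities with a Transformer trained on huge text corpora.

```python
import random

# Toy next-token table (entirely made up for illustration).
# Maps each token to the possible next tokens and their probabilities.
BIGRAMS = {
    "<start>": {"GPT": 1.0},
    "GPT": {"generates": 1.0},
    "generates": {"text": 0.7, "code": 0.3},
    "text": {"<end>": 1.0},
    "code": {"<end>": 1.0},
}

def generate(max_tokens=10, seed=0):
    """Autoregressive generation: sample the next token from the
    distribution conditioned on the current context, then repeat."""
    rng = random.Random(seed)
    tokens = ["<start>"]
    for _ in range(max_tokens):
        choices = BIGRAMS[tokens[-1]]
        words = list(choices)
        weights = [choices[w] for w in words]
        next_token = rng.choices(words, weights=weights)[0]
        if next_token == "<end>":
            break
        tokens.append(next_token)
    return " ".join(tokens[1:])

print(generate())
```

A real GPT differs in scale, not in shape: its "table" is replaced by a neural network that scores every token in a vocabulary of tens of thousands, conditioned on the entire preceding context rather than just the last word.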


Why does GPT matter?

GPT models power chat applications, coding assistants, content generation tools, and enterprise AI systems. Because they are pre-trained on broad text data, a single model can be adapted to many different tasks without building a new model from scratch for each one.

Practice this term

The best way to remember GPT is to practice unscrambling it. AI Terminology Scrambler uses spaced repetition to help you learn and retain AI vocabulary in just a few minutes a day.

Practice GPT now →

Related AI terms