
What is Distillation?

A plain-English explanation of Distillation (Knowledge Distillation) — what it means, why it matters, and how it is used in AI.

Distillation (Knowledge Distillation)
Knowledge distillation is a model compression technique where a smaller "student" model is trained to mimic the behaviour of a larger "teacher" model, typically by matching the teacher's softened output probabilities (soft targets) in addition to the original training labels.
"DistilBERT is a distilled version of BERT that is 40% smaller and 60% faster while retaining 97% of BERT's language understanding performance."

Also known as: Knowledge distillation, model distillation, teacher-student learning
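In practice, the student is usually trained with a loss that blends the ordinary hard-label loss with a term that pulls its predictions toward the teacher's. The sketch below shows one common form of that loss in PyTorch, in the spirit of Hinton et al.'s soft-target formulation; the temperature and weighting values are illustrative assumptions, not fixed choices.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target loss (match the teacher) with the usual
    hard-label cross-entropy. `temperature` and `alpha` are
    illustrative values, not canonical ones."""
    # Soften both distributions with the temperature, then measure
    # how far the student's distribution is from the teacher's.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * (temperature ** 2)

    # Standard supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1 - alpha) * hard_loss
```

Higher temperatures spread the teacher's probability mass across more classes, exposing how it relates classes to one another; the `temperature ** 2` factor keeps the soft-target gradients on a scale comparable to the hard-label term.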

Why does Distillation matter?

Distillation produces smaller, faster, cheaper models that retain most of the teacher's accuracy, making them practical to deploy on devices with limited compute, memory, or latency budgets.

