A plain-English explanation of Distillation (Knowledge Distillation) — what it means, why it matters, and how it is used in AI.
Also known as: Knowledge distillation, model distillation, teacher-student learning
Distillation trains a small "student" model to mimic the outputs of a larger "teacher" model, producing smaller, faster, cheaper models suitable for deployment on devices with limited compute resources.
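As a rough illustration of the core idea, here is a minimal sketch of the classic soft-target distillation loss: the teacher's logits are softened with a temperature and the student is penalized (via KL divergence) for disagreeing with that softened distribution. The logit values and the temperature choice below are hypothetical, and real systems would compute this inside a training loop with a deep-learning framework.

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; a higher temperature yields a
    # softer (more uniform) probability distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's softened distribution and
    # the student's softened distribution (the "soft target" loss).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # The T^2 factor is conventionally used to keep gradient magnitudes
    # comparable when this loss is mixed with a hard-label loss.
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q)
    )

# Hypothetical logits for a 3-class problem.
teacher = [4.0, 1.0, 0.2]
student = [3.0, 1.5, 0.5]
loss = distillation_loss(teacher, student)
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on the true labels, and the student is trained on both signals at once.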
The best way to remember Distillation is to practice unscrambling it. AI Terminology Scrambler uses spaced repetition to help you learn and retain AI vocabulary in just a few minutes a day.
Practice Distillation now →