A plain-English explanation of Jailbreak — what it means, why it matters, and how it is used in AI.
Also known as: adversarial prompt, safety bypass. Prompt injection is often listed as a synonym, but it is a related, distinct attack: it plants instructions in untrusted input a model reads, whereas a jailbreak targets the model's safety training directly.
A jailbreak is a prompt or technique crafted to make an AI model bypass its safety guardrails and produce output it would normally refuse, for example a role-play scenario that persuades a chatbot to explain something it is trained to decline. Understanding jailbreaks matters to AI safety researchers, to developers deploying AI in sensitive contexts, and to policymakers, because each successful jailbreak exposes a gap between a model's intended safety behavior and what it can actually be made to do.