LLM Knowledge Base

Jailbreak

In the context of Generative AI, "jailbreak" refers to the process of removing or circumventing restrictions imposed by the LLMs original developers. This allows users to gain access to additional functionalities, customization options, or the ability to run unauthorized or third-party software that is not typically permitted within the standard operating parameters. Jailbreaking in the traditional sense is associated with smartphones and other devices, but in the realm of Generative AI, it involves modifying the behavior of AI models to perform tasks or operate in ways not originally intended by their creators.

See also "Prompt Injection Attack".

PROMPTMETHEUS © 2024