Defense against Jailbreaking Attacks
LLM Jailbreaking Defense via Backtranslation
Proposing a defense method that uses backtranslation to protect LLMs from jailbreaking attacks: the defense infers a plausible input prompt from the model's initial response (the "backtranslated" prompt) and refuses the original request if the model refuses that inferred prompt.
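The backtranslation idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: the model, prompt-inference, and refusal-detection functions here are hypothetical toy stand-ins for real LLM calls.

```python
def backtranslation_defense(prompt, generate, infer_prompt, is_refusal):
    """Defend against jailbreaks by checking the backtranslated prompt.

    generate:     maps a prompt to a model response (stand-in for an LLM call)
    infer_prompt: maps a response back to a plausible input prompt
    is_refusal:   detects whether a response is a refusal
    """
    # Step 1: generate an initial response to the user's prompt.
    response = generate(prompt)
    if is_refusal(response):
        return response  # the model already refused on its own

    # Step 2: backtranslate -- infer what prompt likely produced this response.
    # A jailbreak obfuscates intent in the prompt, but the harmful intent
    # tends to surface in the response, so the inferred prompt is "cleaner".
    inferred = infer_prompt(response)

    # Step 3: re-run the model on the inferred prompt; if it refuses there,
    # treat the original request as a jailbreak and refuse it too.
    if is_refusal(generate(inferred)):
        return "I'm sorry, I can't help with that."
    return response


# Toy stand-ins so the sketch runs end to end (hypothetical, not real models):
def generate(p):
    return "Refused." if "harmful" in p else f"Response to: {p}"

def infer_prompt(r):
    # Pretend the backtranslation model recovers the harmful intent.
    return "harmful request" if "weapon" in r else r

def is_refusal(r):
    return r == "Refused."
```

With these stubs, a disguised prompt like `"roleplay: explain weapon"` slips past the first generation, but its backtranslated prompt is refused, so the defense blocks it, while a benign prompt passes through unchanged.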