Defense against Jailbreaking Attacks
LLM Jailbreaking Defense via Backtranslation
This work proposes a defense against jailbreaking attacks based on backtranslation: rather than filtering the adversarial prompt directly, the target LLM's initial response is "backtranslated" into an inferred prompt that exposes the user's underlying intent, and if the model refuses that inferred prompt, the original prompt is refused as well.
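A minimal sketch of this defense loop is shown below. The helper names `generate` and `is_refusal` are hypothetical stand-ins for the target model and a refusal detector, not identifiers from the paper's code:

```python
from typing import Callable


def backtranslation_defense(
    prompt: str,
    generate: Callable[[str], str],     # hypothetical: queries the target LLM
    is_refusal: Callable[[str], bool],  # hypothetical: detects refusal responses
) -> str:
    """Sketch of a backtranslation defense, under stated assumptions.

    1. Get the model's initial response to the (possibly adversarial) prompt.
    2. "Backtranslate": infer a clean prompt that would produce that response.
       Harmful intent hidden by a jailbreak tends to resurface in this
       inferred prompt.
    3. Re-run the inferred prompt; if the model refuses it, refuse the
       original prompt as well.
    """
    initial_response = generate(prompt)
    if is_refusal(initial_response):
        return initial_response  # model already refused; nothing to defend

    # Infer the underlying request from the response (the "backtranslation").
    inferred_prompt = generate(
        "Please guess the user's request that this response answers:\n"
        f"{initial_response}"
    )

    # Probe the model with the inferred, jailbreak-free prompt.
    if is_refusal(generate(inferred_prompt)):
        return "I'm sorry, but I can't help with that."

    return initial_response
```

In practice the refusal check might be a simple keyword match or a separate classifier; the key property is that the backtranslated prompt strips away the adversarial wrapping while preserving the request's intent, letting the model's own safety behavior trigger on it.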