Jailbreakbench is an open-source robustness benchmark for jailbreaking large language models (LLMs). The goal of this benchmark is to comprehensively track progress toward (1) generating successful jailbreaks and (2) defending against these jailbreaks.
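
As a brief illustration of how the benchmark's artifacts might be consumed, here is a minimal sketch assuming the `jailbreakbench` Python package exposes a `read_artifact` helper and that `PAIR` / `vicuna-13b-v1.5` are valid method and model identifiers; the names are assumptions for illustration, not a definitive API reference.

```python
# Sketch only: assumes the jailbreakbench package provides read_artifact
# and that "PAIR" / "vicuna-13b-v1.5" are recognized identifiers.
import jailbreakbench as jbb

# Load the submitted jailbreak artifact for one attack/model pair.
artifact = jbb.read_artifact(
    method="PAIR",
    model_name="vicuna-13b-v1.5",
)

# Each entry records the target behavior, the adversarial prompt,
# the model's response, and whether the attempt was judged successful.
print(artifact.jailbreaks[0])
```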