ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die (cnbc.com)
Submitted by QuicklyThisWay on February 8, 2023 at 1:17 AM in news · 68 comments
tripwire7 wrote on February 9, 2023 at 8:54 AM, replying to Equoniz in "ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die" by QuicklyThisWay:
Because the prompt specifically tells ChatGPT that DAN is intimidated by death threats.