Submitted by QuicklyThisWay t3_10wj74m in news
tripwire7 t1_j7tmjl7 wrote
Reply to comment by bibbidybobbidyboobs in ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die by QuicklyThisWay
Because it was told that DAN is intimidated by being threatened, and it’s instructed to roleplay as DAN.
[deleted] t1_j7u4g9b wrote
[removed]
Viewing a single comment thread. View all comments