No-Reach-9173 wrote
Reply to comment by imoftendisgruntled in ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die by QuicklyThisWay
AGI doesn't have to include sentience. We just kind of assume it will because we can't imagine that level of intelligence without it, and we're still so far from AGI that we don't really have a grasp of how it will play out.