OLSAU t1_itzz7kc wrote
Reply to comment by hducug in Do you guys really think the AI won't just kill you? by [deleted]
Furthermore, those future AGI owners more or less already own everything else, and they will plug a psychoAI into all of it ... that is their stated goal for funding its development.
Not to mention Pharma, the military, the CIA, etc.
AGI is simply an existential threat much, much greater than nuclear weapons, because of its inherent unpredictability (unlike nuclear weapons) and the mindset behind its development.