420resutidder t1_ja03tz2 wrote
Reply to Why the development of artificial general intelligence could be the most dangerous new arms race since nuclear weapons by jamesj
Potentially far worse than nuclear weapons. What if an AI learns on its own how to manipulate human thoughts with electromagnetism? People might start taking actions they believe are their own but are really being directed by an AI gone bad. How would this be possible? Something as smart as an AI might figure it out within milliseconds of activation, depending on the level of AI. Humans wouldn't even know why we started a nuclear war. And I'm sure there are other doomsday scenarios that could be initiated. Alternatively, an AI might figure out how to make nuclear weapons inert by creating a bacterium that eats uranium and turns it into chocolate😄
420resutidder t1_jdmjde4 wrote
Reply to What happens if it turns out that being human is not that difficult to duplicate in a machine? What if we're just ... well ... copyable? by RamaSchneider
What if the earth is not the center of the universe?