Submitted by Sasuke_1738 t3_10lnfy7 in Futurology
Callisto_NTG t1_j5z4zwk wrote
Reply to comment by rypher in Are most of our predictions wrong? by Sasuke_1738
What does that mean?
rypher t1_j5z8niv wrote
“AI is supposed to help us, not take over for us”
While that may be true, it doesn't have any impact on whether that will be the outcome. AI might not help us overall (probably just exacerbate income inequality) and might take over (not become our overlords, but slowly take more important decisions away from humans in terms of war/medicine/geopolitics).
“It's not supposed to happen” is like thinking we are not supposed to break the law, start wars, or eat too much sugar. “Supposed to” doesn't really mean much.
Callisto_NTG t1_j5zbq32 wrote
Yeah, well, my point is that we should be planning ahead so that these bad scenarios don't happen. We need to be in control of AI, not the other way around.
AbyssalRedemption t1_j65uzka wrote
Well see, this is the thing: if it became even somewhat clear to the public at large that these things were inevitably going to happen, people wouldn't just sit and take it. Lay off 20% of the workforce, and people will riot. Disrupt the current order too quickly and there will be civil unrest, potentially hindering future progress.
jomo666 t1_j61xkfw wrote
AI can't overthrow us until it's able to maintain its physical parts independently. Without humans providing electricity, conduit maintenance, etc., AI goes to shit quickly.
rypher t1_j61ycwy wrote
Nah man, it won't be humans versus AI like the fucking Matrix. It will be extremely rich companies with AI and humans versus the other, less fortunate masses. As long as there are humans alive, they can be bought with money or ideology.