Submitted by kvothekevin t3_1271vpb in Futurology
manicdee33 t1_jec4yb9 wrote
If human labour is not necessary, who actually controls the machines?
What if the machines decide that humans are just animals like all the others, to be managed accordingly: feeding, care, and various measures to keep the population under control?
What if the actual backstory to Terminator is that Skynet became smarter than us, realised that the human population had grown too large, and instituted population control measures such as mandatory birth control with licensed pregnancies, and John Connor's rebels are actually fighting that system because they believe humans should be free to have as many children as they want? The odd act of rebellion escalated to violence, which escalated to full-on thermonuclear war against the environmental vandals.
So IMHO when we get to a post-scarcity utopia it will be because we humans have adapted to all life on Earth, ours included, being stewarded by the benevolent computer overlords.
Cubey42 t1_jecgkck wrote
Population must've been set to 0 and the scarcity set to "nuclear wasteland".
manicdee33 t1_jeci1fa wrote
Nah, there's a level in there somewhere where the human population is stable and able to continue being creative and inventive. How cute is it when humans think they've discovered a new law of physics? Awww!
If you go higher, they end up over-consuming renewable resources such as fresh water. If you go lower, the population ends up inbred or just dies off completely.
Also, by managing the human population (and a small number of predator species populations outside the human zone of influence), the rest of the ecosystem manages itself quite handily.
Oh, have you seen what we did with Mars and Venus? The Venusian fjords are just chef's kiss.
Evipicc t1_jee6grk wrote
This feels like baseless fear-mongering to me. The implication that we'd allow a system like this to exist at all, let alone with that kind of total control over us, is bonkers.
acutelychronicpanic t1_jeehe0n wrote
The alignment of machine intelligence must be internal. The machines have to actually want the same future for humanity that we want, and share our values.
There is no system you could use to harness and control a superintelligence that would be safe. The idea that we'll need people to control them probably isn't accurate. We won't need to direct them; we'll just need to have preferences.