Submitted by Gari_305 t3_10qi8a1 in Futurology
Larkson9999 t1_j6rm0y4 wrote
Why would AI waste the time to keep our meat brains or even a simulation of them alive? Humans don't have any real value to advanced AI. Once immortal self-repairing machines are real, humans won't be.
Also, this speculative crap just assumes these technologies will keep advancing instead of stalling out the way space travel and automobiles have. Shuttles and cars have been refined over the last 75 and 100 years, sure, but they haven't made the leap to the fully automated cars or interstellar flights people imagined when they became common technology.
Our imaginations are very much limited by the constraints of reality. Maybe AI can build a better system for most things, but it will probably just build around us animals.
thetoxictech t1_j6rqrl5 wrote
"This speculative crap," yet here you are claiming the AI would just kill us. One problem:
AI might be smarter, faster, whatever. But you can't deny that a biological brain is vastly superior in terms of energy in versus processing power out. An AI wouldn't just see biology as useless; if it was genuinely intelligent, it would see the value biology could have.
Larkson9999 t1_j6rrrfr wrote
I don't think AI needs to kill us to make us extinct, or at least to entirely change whose society it is. We already have countries whose populations are declining without intelligent sex robots. What percentage of society will take on the labor of having children when even mildly passable AI can simulate the spouse experience?
Now imagine the first AI elected official, the first AI police, and AI city planners. AI doesn't need to kill us to replace us.
thetoxictech t1_j71ocw4 wrote
Yeah, I'm literally fine with all of that; they'd be better at making decisions than a human. Reasoning: if it's intelligent enough to do those jobs in the first place, and it can process far more data and reason more logically than we can, it can see the full picture and make the best decision with the data at hand. Human judgement is incredibly easy to cloud.
Larkson9999 t1_j7jspa3 wrote
So you'd accept a robot spouse that is the breadwinner, takes care of the house, and lets you stay unemployed? That robot can't give you children, so by serving all your other needs it has essentially "killed" you with kindness and made certain you won't reproduce.
That's why I think AI won't need to exterminate us if they can just stop us from mating with each other.
thetoxictech t1_j7p8fam wrote
Bold of you to assume I want children to begin with. I'd be completely fine with an AI spouse, in the context of it being fully sentient.
Larkson9999 t1_j7p8jbl wrote
Oh, most would too! So: fewer people, and eventually extinction. My point isn't that you personally need to have kids; we just need to keep the population from declining much below 2 billion, or we start going backwards fast.
So yeah, we can and should manage our population, but handing that job to an AI that may not need us seems a risky gamble when the entire species could die out in the worst case.
Test19s t1_j6rogjy wrote
>cars
There are significant numbers of electric vehicles on the roads, small clusters of autonomous cars/trucks and delivery robots in places like San Francisco, and increasing tech integration in cars versus even 2013.