Submitted by uswhole t3_10tutl1 in singularity
crap_punchline t1_j78w9ly wrote
They won't get automated. They work as farm labourers, carpenters, seamstresses, blacksmiths, etc., for each other, on land they own. They never used the tech in the first place.
It's an interesting point though, as it points towards a great bifurcation of humanity: between humans and transhumans. I think you will see a new class of landowners who want to perpetuate life roughly as it exists for humans today, a sort of Amish of the late 20th century, and lots of people aggrieved enough by automation that they want to continue to live a life they recognise.
The rest of us will be more open to the rapid changes on the horizon and will view technological augmentation of humanity as an extension of our identity, or as a means of escaping the trappings of humanity altogether.
ComplicitSnake34 t1_j7976y5 wrote
I don't fault anyone for doing that tbh. A lot of people already feel disillusioned with society because of the lack of effective politics and community. Still, it'd be a sad existence to live in private with other people who ultimately felt their innately human ideas didn't matter to the rest of society.
The alternative of a "singularity" (whatever that means; there are a billion definitions) seems like a mixed bag.
Best case scenario, if labor can be entirely automated, then the only "job" would be political discussion and philosophy, and "nations" would dictate what direction to steer the AI in (space exploration? gene editing? psychedelic exploration?). People would still do their chores and hobbies, and maybe they'd treat them as "jobs", but ultimately everyone would be working for the "government" out of necessity. People wouldn't "work" and instead would have all the time in the world to contemplate and to use human intellect in expanding their own humanity. This hypothetical is if AI is treated as a tool rather than a savior.
The worst case scenario would be a hivemind-esque AI system people are plugged into, where the private sphere has entirely vanished and any differentiation (humanity) is erased. These "humans" would have transcended their bodies and become floating minds operating within an AI-fueled digital/physical space that has full control. On a whim, the AI could easily determine which minds are to be erased because of their "possible" harm to others. Inevitably it'd result in a whittling down of humanity into a single animal hivemind where individuals are interchangeable: a benevolent AI's misguided way of preserving humanity.
Smellz_Of_Elderberry t1_j7b2kd1 wrote
My favorite book on this subject is Blindsight by Peter Watts. It predicts many different sects of mankind; some choose simply to put their brains into a permanent state of extreme meditation and then go into stasis. They are the remnants of meditative culture.
Frumpagumpus t1_j7bm6rk wrote
I think your worst case scenario is actually the best case scenario, but I don't think you've really put much thought or justification into some of the properties you think that scenario will have.
I harp on these points in like every other comment, but here we go again...
> hive mind
No. The importance of data locality to computation means intelligence, and especially self-awareness, will NECESSARILY be distributed. However, the extreme speedup in communication/thinking, maybe a million times faster, MIGHT (maybe probably would) mean that, to humans, it would seem like a hive mind.
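A rough back-of-the-envelope sketch of where a figure like "a million times faster" could come from, and why data locality still matters. All the numbers here (axon speed, firing rate, a 2 GHz clock, a 100 m datacenter) are ballpark assumptions for illustration, not anything from the thread:

```python
# Ballpark, illustrative numbers only (assumptions, not measurements):
# compare biological signalling with electronic signalling to see where a
# "maybe a million times faster" intuition comes from, and why data
# locality still forces a distributed mind.

AXON_SPEED_M_S = 100.0        # fast myelinated axon, roughly 100 m/s
WIRE_SPEED_M_S = 2.0e8        # signal in copper/fibre, roughly 2/3 of c

NEURON_FIRING_HZ = 200.0      # upper-end cortical firing rate
CHIP_CLOCK_HZ = 2.0e9         # a modest 2 GHz clock

print(f"propagation speedup: ~{WIRE_SPEED_M_S / AXON_SPEED_M_S:.0e}x")   # ~2e6
print(f"switching speedup:   ~{CHIP_CLOCK_HZ / NEURON_FIRING_HZ:.0e}x")  # ~1e7

# Data locality: even at electronic speeds, a round trip across a 100 m
# datacenter costs about a microsecond -- thousands of clock cycles. A mind
# "thinking" at GHz rates can't afford a remote fetch for every step, so the
# computation (and arguably the self-awareness) has to stay local, and
# therefore distributed.
DATACENTER_M = 100.0
round_trip_s = 2 * DATACENTER_M / WIRE_SPEED_M_S
print(f"100 m round trip: {round_trip_s * 1e6:.1f} us "
      f"= {round_trip_s * CHIP_CLOCK_HZ:.0f} clock cycles")
```

Whatever exact numbers you pick, minds coordinating on microsecond timescales would look like a single hive mind from a human vantage point, even though the computation itself stays distributed.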
> the AI could easily determine which minds are to be erased
My take is that post-human intelligences will intentionally copy and erase themselves because it is convenient to do so. The human take on life and death is a cultural value associated with our brains being anchored to our bodies.
My guess would be that most of this copying and erasing would occur under one's own will. Obviously, computer viruses would become analogous to a much, much more dangerous version of modern biological viruses. However, if I had to bet, while bad stuff would happen, it would happen at a lower rate than bad stuff currently happens at a population level in our society (any given individual would be much less likely to die in an accident or disaster).
gangstasadvocate t1_j7bk56c wrote
Gang gang I’m hoping and advocating for the first outcome with the psychedelic exploration
Smellz_Of_Elderberry t1_j7b21sh wrote
They will inherit the earth after we all upload our brains into the matrix.