ComplicitSnake34 t1_j7976y5 wrote

I don't fault anyone for doing that tbh. A lot of people already feel disillusioned with society because of the lack of effective politics and community. Still, it'd be a sad existence to live in private with other people who ultimately felt their innately human ideas didn't matter to the rest of society.

The alternative of a "singularity" (whatever that means; there are a billion definitions) seems like a mixed bag.

Best case scenario: if labor can be entirely automated, then the only "job" left would be political discussion and philosophy, and "nations" would dictate what direction to steer the AI in (space exploration? gene editing? psychedelic exploration?). People would still do their chores and hobbies, and maybe they'd treat those as "jobs", but ultimately everyone would be working for the "government" out of necessity. People wouldn't "work" and instead would have all the time in the world to contemplate and use human intellect to expand their own humanity. This hypothetical assumes AI is treated as a tool rather than a savior.

The worst case scenario would be a hivemind-esque AI system people are plugged into, where the private sphere has entirely vanished and any differentiation (humanity) is erased. These "humans" would have transcended their bodies and become floating minds operating within an AI-fueled digital/physical space that has full control. On a whim, the AI could easily determine which minds are to be erased because of their """possible""" harm to others. Inevitably it'd result in a whittling down of humanity into a single animal hivemind where individuals are interchangeable. A benevolent AI's misguided attempt at preserving humanity.

8

Smellz_Of_Elderberry t1_j7b2kd1 wrote

My favorite book on this subject is Blindsight by Peter Watts. It depicts many different sects of mankind; some chose simply to enter a permanent state of extreme meditation, then go into stasis. They were the remnants of meditative culture.

3

Frumpagumpus t1_j7bm6rk wrote

I think your worst case scenario is actually the best case scenario, but I don't think you've really put much thought or justification into some of the properties you assume that scenario would have.

I harp on these points in like every other comment, but here we go again...

> hive mind

No: the importance of data locality to computation means intelligence, and especially self-awareness, will NECESSARILY be distributed. However, the extreme speedup in communication/thinking (maybe a million times faster) MIGHT (maybe probably would) mean that, to humans, it would seem like a hive mind.

> the Ai could easily determine which minds are to be erased

My take is that posthuman intelligences will intentionally copy and erase themselves because it is convenient to do so. The human take on life and death is a cultural value tied to our brains being anchored to our bodies.

My guess would be that most of this copying and erasing would occur of one's own free will. Obviously computer viruses would become analogous to a much, much more dangerous version of modern biological viruses. However, if I had to bet: while bad stuff would happen, I'd bet it would happen at a lower rate than bad stuff currently happens at a population level in our society (any given individual would be much less likely to die in an accident or disaster).

1

gangstasadvocate t1_j7bk56c wrote

Gang gang I’m hoping and advocating for the first outcome with the psychedelic exploration

−1