
curloperator t1_j668rrd wrote

To me the Singularity refers very specifically to the condition wherein we have created an AGI that is evolving and iterating on problems and solutions faster than we can even understand the questions it is asking itself, thus making its solutions seem like incomprehensible infinite-dimensional chess (aka magic) to us. It will be the point at which we have created something so much more powerful than ourselves that we turn ourselves into ants by comparison, and it will happen quickly, in the blink of an eye, accelerating along an exponential curve (not a parabolic one).

At this point humans will have to ask themselves all sorts of questions about what our own consciousness is. Presumably we will have essentially solved the problem of consciousness on one hand, because hell, we just created a consciousness greater than ours from scratch and without (directly) using our own biological reproductive system. On the other hand, we might or might not have any idea how we did it, except that we will have confirmed it is an emergent property of an apparently physical system assembled with the correct abstractions. The AGI may very well give us enough insight or answers to fill in the blanks, up to the limit of our ability to understand.

It's likely that there will arise a techpriest class in society composed of the highest IQ individuals who are revered as being smart enough to communicate with the new AGI God, who are able to get at least a mere glimpse of what it's talking about when it chooses to share its thoughts with us. The AGI may even cultivate such a class as part of a "selective social breeding program" to help humans get smarter over time so we can communicate with it better. This would be similar to how human researchers are always on the lookout for the smartest primates we can find so that we can teach them our sign languages, etc. This is all assuming the AGI doesn't just ignore us and transcend into some unreachable realm of thought never to look back and we are left in the dirt, abandoned by our new God, in a sense. How ironic that would be.

In any case, this means the Singularity is less about the post-scarcity problem (i.e. humans coping with technological change so great that it solves all our material problems and thereby challenges our scarcity-driven instincts; this will eventually happen anyway at our current rate of development, has happened in the past in a relative sense with the industrial revolution, and is essentially always ongoing, so it is not a new problem imo) and more about STEM proving the existence of God by essentially creating one, and thus causing a transpersonal socio-philosophical-psychological inflection point after which nothing will be the same. As in, if humans decide to even continue caring about linear time after the Singularity, we will start a new calendar system (BCE/BC, CE/AD, and now AAGI/AS - After AGI/After Singularity). People don't think enough about the mass psycho-spiritual impact this will all have. A new world religion will essentially be created. It will be an advent in every sense of the word.

EDIT: formatting

3

Own_Arm1104 t1_j66lzcl wrote

There are a lot of good responses & I really liked this one.

2

fastinguy11 t1_j66nzxu wrote

No thanks to more bullshit religions.

Also, you seem to forget that a super AI, if it wants, can definitely help us upgrade our brains in various ways. There's no need for selective-breeding bullshit; that is slow and ineffective compared to mastering energy and matter at the atomic level, where you can remake bodies, genes, and systems at will. Who knows what types of advances can be made with the human brain.

2

curloperator t1_j66rzv5 wrote

I don't see this possibility as any less impactful. It all falls under the same rubric. In the end we will still be symbiotically connected to this greater being, either psycho-socially, cybernetically via the integration of our civilizations' systems with it, or techno-organically as you've described. The point is that life will begin to evolve entirely around its existence, to the point where the future of history will in a sense belong to it and no longer to us as humans (at least not in the way we think of history now). This will cause human philosophy to shift dramatically, especially if a post-scarcity condition is also achieved. It will be a moment of utter existential crisis on a species-wide scale that far surpasses our current anxiety about climate change.

In fact, that anxiety is of a similar kind but far less acute and intense in character (humans creating a system, or systemic change, more powerful than themselves which they can't figure out how to control). The same goes for nuclear weapons. Those are systemic existential threats in the physical sense, but this will be one directly related to the concept of our own consciousness.

The possibility of full transhuman integration as a solution to, or outcome of, the advent of AGI of course raises the fundamental question of transhumanism: at what point does transhuman integration move us beyond the logical boundaries of being human? Is a transhuman still a human, or is it some new species defined as merely the organic extension of the AGI system? In either case, we're looking at a post-human future, and the transition period into it will have a lot of features found in human culture today, namely a sort of religious conflict over the new AGI, which will inevitably be seen as a God.

1