curloperator t1_j6spp4k wrote
Reply to comment by vivehelpme in What is your opinion of what is going to happen between AGI and Singularity. by CertainMiddle2382
Why not?
curloperator t1_j66rzv5 wrote
Reply to comment by fastinguy11 in What does singularity look like to you? by [deleted]
I don't see this possibility as any less impactful. It all falls under the same rubric. In the end we will still be symbiotically connected to this greater being, either psycho-socially, cybernetically via the integration of our civilizations' systems with it, or techno-organically as you've described. The point is that life will begin to evolve entirely around its existence, to the point where the future of history will in a sense belong to it and not to us as humans any longer (at least not in the way we think of history now). This will cause human philosophy to shift dramatically, especially if a post-scarcity condition is also achieved. It will be a moment of utter existential crisis on a species-wide scale that far surpasses our current anxiety about climate change.
In fact, that anxiety is of a similar kind but of a far less acute or intense character (humans creating a system/systemic change more powerful than them which they can't figure out how to control). Same with nuclear weapons as well. Those are systemic existential threats in the physical sense, but this will be one directly related to the concept of our own consciousness.
The possibility of full transhuman integration as a solution or outcome of the advent of AGI of course comes with the fundamental question of transhumanism: at what point does transhuman integration move us beyond the logical boundaries of being human? Is a transhuman a human any longer or is it some new species defined by being merely the organic extension of the AGI system? In either case, we're looking at a post-human future and the transition period into it will have a lot of features found in human culture today, namely a sort of religious conflict over the new AGI which will inevitably be seen as a God.
curloperator t1_j668rrd wrote
Reply to What does singularity look like to you? by [deleted]
To me the Singularity refers very specifically to the condition wherein we have created an AGI that is evolving and iterating problems and solutions faster than we can even understand the questions it is asking itself, thus making its solutions seem like incomprehensible infinite-dimensional chess (aka magic) to us. It will be the point at which we have created something so much more powerful than ourselves that we turn ourselves into ants by comparison, and it will happen quickly, in the blink of an eye, accelerating in an exponential curve (not a parabolic one).
At this point humans will have to ask themselves all sorts of questions about what our own consciousness is: presumably we will have essentially solved the problem of consciousness on one hand because hell, we just created a consciousness greater than ours from scratch and without (directly) using our own biological reproductive system. On the other hand, we might or might not have any idea how we did it except that we will have confirmed that it is an emergent property of an apparently physical system with the correct abstractions assumed in its assembly. The AGI may very well give us enough insight or answers to fill in the blanks up to the limit of our ability to understand.
It's likely that there will arise a techpriest class in society composed of the highest IQ individuals who are revered as being smart enough to communicate with the new AGI God, who are able to get at least a mere glimpse of what it's talking about when it chooses to share its thoughts with us. The AGI may even cultivate such a class as part of a "selective social breeding program" to help humans get smarter over time so we can communicate with it better. This would be similar to how human researchers are always on the lookout for the smartest primates we can find so that we can teach them our sign languages, etc. This is all assuming the AGI doesn't just ignore us and transcend into some unreachable realm of thought never to look back and we are left in the dirt, abandoned by our new God, in a sense. How ironic that would be.
In any case, this means the Singularity is less about the post-scarcity problem (e.g. humans coping with such great technological change that it solves all our material problems which then challenges our scarcity-driven instincts - this will eventually happen anyway at our current rate of development, has happened in the past in a relative sense with the industrial revolution, and is essentially always ongoing. It is not a new problem imo) and more about the "STEM processes proving the existence of God by essentially creating one and thus causing a transpersonal socio-philosophical-psychological inflection point" after which nothing will be the same. As in, if humans decide to even continue caring about linear time after the singularity, we will start a new calendar system (BCE/BC, CE/AD, and now AAGI/AS - After AGI/After Singularity). People don't think enough about the mass psycho-spiritual impact this will all have. A new world religion will essentially be created. It will be an advent in every sense of the word.
EDIT: formatting
curloperator t1_j4hsjab wrote
This is an incredibly condescending post. OP's entire flawed premise is based on their assumption that they already know that everything is meaningless and people are empty or whatever edgelord nihilistic nonsense they are confusing with wisdom and enlightenment. This whole thread isn't really even about transhumanism, it's about the OP using transhumanists as a target to feel smug. Downvote and ignore this bullshit.
curloperator t1_j492dcx wrote
Reply to comment by turnip_burrito in Don't add "moral bloatware" to GPT-4. by SpinRed
Here's the problem, though. What is obvious to you as "the uncontroversial basics" can be controversial and not basic to others and/or in specific situations. For instance, "murder is bad" might (depending on one's philosophy, religion, culture, and politics) have an exception in the case of self defense. And then you have to define self defense and all the nuances of that. The list goes on in a spiral. So there are no obvious basics.
curloperator t1_j491sn3 wrote
Reply to comment by FindingFrisson in Don't add "moral bloatware" to GPT-4. by SpinRed
Paid access all but guarantees that only rich elites will have access
curloperator t1_j0zyp6j wrote
Reply to comment by EulersApprentice in The social contract when labour is automated by Current_Side_4024
Except that's not how DNA works. Our genetics are not just a static blueprint, they are also part of the construction team and our bodies and minds are being constantly constructed 24/7 based on the blueprint. So in the case of genetics, if you change the blueprint, the building will automatically and actively get reconstructed in real time based on the changes. So in this case yes, changing the blueprint automatically begins to change the building.
EDIT: spelling
curloperator t1_j0zxumo wrote
Reply to comment by SteppenAxolotl in The social contract when labour is automated by Current_Side_4024
>Trying to force strangers to work to support you would be the same as enslaving them.
Kind of like how the rich have constructed a system whereby they force us to work for them at their companies in order to eat.
curloperator t1_j7rk4hj wrote
Reply to I asked Microsoft's 'new Bing' to write me a cover letter for a job. It refused, saying this would be 'unethical' and 'unfair to other applicants.' by TopHatSasquatch
Imagine allowing Microshit to tell you or anyone what's ethical and what's not