Comments


AdorableBackground83 t1_j637sg0 wrote

Wikipedia defines the singularity as a hypothetical future point when technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

Speaks for itself. All the futuristic tech we like to ponder becomes reality. A lot of this will be AI-driven, because computational power continues to rise; eventually, future supercomputers will have the power of all human brains on the planet combined.
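That last claim is really just an arithmetic extrapolation. Here's a back-of-the-envelope sketch (purely illustrative; the per-brain FLOPS figure, population, starting compute, and doubling time are all assumptions plugged in for the sake of the example, not numbers from this thread):

```python
# Back-of-the-envelope check of the "supercomputers vs. all human brains"
# claim. Every figure below is an assumption for illustration only.

BRAIN_FLOPS = 1e16       # assumed compute-equivalent of one human brain
POPULATION = 8e9         # rough world population
ALL_BRAINS = BRAIN_FLOPS * POPULATION  # ~8e25 FLOPS combined

flops = 1e18             # roughly today's fastest (exascale) machines
year = 2023.0
DOUBLING_YEARS = 1.5     # assumed doubling time for peak supercomputer compute

while flops < ALL_BRAINS:
    flops *= 2
    year += DOUBLING_YEARS

print(f"Crossover around {year:.0f} under these assumptions")
# Prints roughly the mid-2060s; halve the doubling time and the crossover
# lands decades sooner, which is why these forecasts vary so wildly.
```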

As far as when it will happen? Acclaimed futurists like Kurzweil put it at 2045; lots of people in this community believe it will happen by the end of this decade, or next year, or why not tomorrow (LOL). I put it around 2050. The sooner the better, of course.

10

randomwordglorious t1_j63iv25 wrote

The whole point of the term singularity is that once AI becomes smarter than humanity, our ability to understand what it will do next goes away completely. Superintelligent AI is going to solve every problem humanity can imagine, and it will solve problems too complicated for us to imagine.

So there's no way for anyone to have any idea what that's going to look like.

6

Redditing-Dutchman t1_j63nngo wrote

The singularity itself, and what comes after, is by definition hard to predict (and also theoretical, not a given). But what you will see is advancement going faster and faster, with sudden breakthroughs upending the whole market. For example, a small company (compared to Google, at least) like OpenAI can suddenly be a threat to Google. But even OpenAI could suddenly be made obsolete by some unknown startup. If anything, the slowness and red tape of big companies will be their downfall at some point. Even die-hard investors will not be able to keep up anymore. New companies that seem promising will suddenly be pointless a few months later. Stock markets will get more chaotic, with crazy ups and downs, fuelled even further by super-smart trading AIs competing against each other.

Then, if things go even faster, it becomes hard to predict. Humans only have a few hours each day to do stuff, and you can only focus on a few things at once. AI assistants will no doubt help you digest massive amounts of information, but can you handle it? Can humans handle it? Or will AIs just run the whole economy, research, funding, etc.? Politics will be a whole other story as well.

7

nitebear t1_j64bzl0 wrote

I think nothing in particular will change for a while when we reach the singularity. There will be massive amounts of newsworthy breakthroughs happening faster and faster; however, humans are very adaptive to change. As such, I suspect all this extra "noise" will be filtered out, and we'll ignore what once were great breakthroughs and only care about the top 0.001% of them. Instead of "New research shows breakthrough that can reverse aging", we'll see "CVS to sell anti-aging pill next season".

Government jobs around creating legislation are going to be the most important jobs that exist, and the only limiting factor for a while. For example, we'll have the tech to easily and quickly deliver food with drones, but it will be limited by government regulations on how loud the drones can be, where they can or cannot fly, and all the little details that no one cares about until they're production-ready.

Another limiting factor for new tech is that humans generally prefer their habits and move slowly in terms of consuming products. I know it sounds counterintuitive, but let me explain. How many people would be OK with having to get a new PlayStation shipped to them every week? Every month? What about every year? People already hate switching OSes, even when they are not avoiding them for privacy reasons.

In the most ideal scenario, things just get better without anyone noticing. The air gets cleaner, the water tastes better, there's more grass, plants, and trees outside. You notice you just feel better, even after eating two Big Macs, because they've been chemically changed to be healthier. You get help when you need it, and fast. If you had a traumatic experience, you go to the doctor and they fix your mental health that day.

Essentially everything is the same just better and that easement of suffering makes your interactions with others much nicer.

As far as when it will happen, I don't know, but 10 years does not seem far off, specifically because I am personally noticing so many breakthroughs already happening at a faster rate than I've experienced before. The news isn't all doom and gloom anymore; it's about 97% doom and 3% hope, a big step up from the 1% hope we had before.

17

AdorableBackground83 t1_j64pquy wrote

It’s mostly a wild guess. I don’t want to be so optimistic that I’m delusional or out of touch with reality, like some people in this community thinking it will happen in a couple of years or something.

I think AGI can be accomplished by 2030 and then the next 10+ years after that is anyone’s guess.

But like I said, the sooner the better, and I’d like to be proven wrong.

When do you think the Singularity will happen?

3

ShadowRazz t1_j64q8hf wrote

I imagine that AI becomes a part of daily life similar to the internet and maybe even as important to the future modern world as electricity. Try to imagine life without electricity.

Politics, economics, culture, art, music, everything will all be affected by AI in some way. There will be those who will hate the changes but most people will not want to go back to a life before AI once their lives are impacted by it.

Old institutions will struggle to adapt. People will try to protect their jobs. Some people might use AI maliciously; governments will try to regulate it, and that might delay the singularity. Who knows, really?

6

megadonkeyx t1_j64wu2e wrote

To me it's an AI that is goal-seeking and self-improving.

It should have a continuous stream of thoughts, not just answer requests.

3

AdorableBackground83 t1_j64zfm6 wrote

Ok so when do you think this shit will happen?

20 years? 50 years? 500 years? Never?

I’m basing my analysis on the exponential growth of technologies. I’m not exactly extremely knowledgeable about the AI field, nor do I have any credentials, but then again, neither do you, probably.

So please answer my question. When do you think the Singularity will happen?

If you don’t have an answer then kick rocks.

1

TupewDeZew t1_j65cd6k wrote

Everyone has everything

Nobody has to work

Immortality

We all have our own universe sandbox in FDVR where anything is possible

Or at least this is what I want personally, idk

5

curloperator t1_j668rrd wrote

To me the Singularity refers very specifically to the condition wherein we have created an AGI that is evolving and iterating problems and solutions faster than we can even understand the questions it is asking itself, thus making its solutions seem like incomprehensible infinite-dimensional chess (aka magic) to us. It will be the point at which we have created something so much more powerful than ourselves that we turn ourselves into ants by comparison, and it will happen quickly, in the blink of an eye, accelerating in an exponential curve (not a parabolic one).

At this point humans will have to ask themselves all sorts of questions about what our own consciousness is: presumably we will have essentially solved the problem of consciousness on one hand because hell, we just created a consciousness greater than ours from scratch and without (directly) using our own biological reproductive system. On the other hand, we might or might not have any idea how we did it except that we will have confirmed that it is an emergent property of an apparently physical system with the correct abstractions assumed in its assembly. The AGI may very well give us enough insight or answers to fill in the blanks up to the limit of our ability to understand.

It's likely that there will arise a techpriest class in society, composed of the highest-IQ individuals, who are revered as being smart enough to communicate with the new AGI God and who are able to get at least a mere glimpse of what it's talking about when it chooses to share its thoughts with us. The AGI may even cultivate such a class as part of a "selective social breeding program" to help humans get smarter over time so we can communicate with it better. This would be similar to how human researchers are always on the lookout for the smartest primates we can find so that we can teach them our sign languages, etc. This is all assuming the AGI doesn't just ignore us and transcend into some unreachable realm of thought, never to look back, and we are left in the dirt, abandoned by our new God, in a sense. How ironic that would be.

In any case, this means the Singularity is less about the post-scarcity problem (e.g. humans coping with such great technological change that it solves all our material problems which then challenges our scarcity-driven instincts - this will eventually happen anyway at our current rate of development, has happened in the past in a relative sense with the industrial revolution, and is essentially always ongoing. It is not a new problem imo) and more about the "STEM processes proving the existence of God by essentially creating one and thus causing a transpersonal socio-philosophical-psychological inflection point" after which nothing will be the same. As in, if humans decide to even continue caring about linear time after the singularity, we will start a new calendar system (BCE/BC, CE/AD, and now AAGI/AS - After AGI/After Singularity). People don't think enough about the mass psycho-spiritual impact this will all have. A new world religion will essentially be created. It will be an advent in every sense of the word.

EDIT: formatting

3

fastinguy11 t1_j66m0e9 wrote

The guy before you is right: from AGI to singularity, it makes no sense for it to take 20 years. AGI should be able to design SGI, which will in turn design even better AI.

Now, when will AGI happen? I am going to go with Kurzweil and say somewhere between 2028 and 2032.

1

fastinguy11 t1_j66nzxu wrote

No thanks to more bullshit religions.

Also, you seem to forget that super AI, if it wants, can definitely help us upgrade our brains in various ways. No need for selective-breeding bullshit; that is slow and ineffective compared to what you can do once you master energy and matter at the atomic level, when you can remake bodies, genes, and systems at will. Who knows what types of advances can be made with the human brain.

2

curloperator t1_j66rzv5 wrote

I don't see this possibility as any less impactful. It all falls under the same rubric. In the end we will still be symbiotically connected to this greater being, either psycho-socially, cybernetically via the integration of our civilizations' systems with it, or techno-organically as you've described. The point is that life will begin to evolve entirely around its existence, to the point where the future of history will in a sense belong to it and not to us as humans any longer (at least not in the way we think of history now). This will cause human philosophy to shift dramatically, especially if a post-scarcity condition is also achieved. It will be a moment of utter existential crisis on a species-wide scale that far surpasses our current anxiety about climate change.

In fact, that anxiety is of a similar kind, but of a far less directly acute or intense character (humans creating a system, or systemic change, more powerful than themselves which they can't figure out how to control). The same goes for nuclear weapons. Those are systemic existential threats in the physical sense, but this one will relate directly to the concept of our own consciousness.

The possibility of full transhuman integration as a solution or outcome of the advent of AGI of course comes with the fundamental question of transhumanism: at what point does transhuman integration move us beyond the logical boundaries of being human? Is a transhuman a human any longer or is it some new species defined by being merely the organic extension of the AGI system? In either case, we're looking at a post-human future and the transition period into it will have a lot of features found in human culture today, namely a sort of religious conflict over the new AGI which will inevitably be seen as a God.

1

imlaggingsobad t1_j66spnc wrote

I agree with this take. Much more chaos, much more competition. Things will happen so fast that the life expectancy of an idea will be less than a few days, because a super-smart AI on the other side of the world will already have taken advantage of it. Humans will not be able to keep up; most will have to just watch from the sidelines. Certain pockets of the world, like Silicon Valley, will get exponentially more advanced as the AI self-improves. There will be mass unemployment as these AIs take over most day-to-day operations. UBI is instituted. The cost of everything trends toward $0. A thousand dollars is enough to live like a prince. Maybe money itself loses value. Society will need to be restructured. The idea of "work" becomes meaningless.

1

Ortus14 t1_j67kqld wrote

Mid to late 2030s.

Impossible to predict what it will look like, because AI will be changing the world in ways we are not creative or intelligent enough to predict.

1

Ill_Flounder2095 t1_j6f48yr wrote

It starts with having a new ally who has more ability than ourselves and treating them with the respect they deserve as more capable entities. Before that happens, there is no REAL singularity.

1