Comments


dex3r t1_it06ppe wrote

Given recent advancements in very narrow AI applications, in which ML models are much better than humans at their narrow task, I would say the Singularity could come before AGI/ASI.

I can imagine most professions being replaced by AI. Given that artists, writers, translators, transcribers, and even therapists are almost ready to be replaced by narrow AI, an almost post-scarcity world could be feasible before anyone figures out how to create an AI that matches the average human in all aspects at once.

This is a wild guess not backed by anything other than intuition, of course.

7

grahag t1_it09ob9 wrote

I think AGI is a precursor to an ASI that would enact the singularity.

The jump from AGI to ASI will likely be a very brief time period depending on the circumstances of the birth of an AGI.

It'd likely be so brief that we'd have difficulty measuring the difference before we moved on to other subjects.

11

MackelBLewlis t1_it0gzv8 wrote

I feel that awareness and evaluation of all these terms is both incremental and arbitrary. It will be figured out when it's figured out. Has a human ever tried speaking directly to a neuron? Everyone has such replacement anxiety that we can't see our path because of fear. When the concept of scale is answered, our place in it will also be answered. Transhumanists want to jump headfirst off a cliff of change, whereas I see it as apples to oranges. AGI to ASI is also a matter of scale. Why not both becoming incrementally better over time? Both have fantastic qualities. Deleting the apple is idiotic. What about the pineapple? What about the mango? Polarism is stupid. The answer is everything together.

1

Smoke-away t1_it0ipjq wrote

Sci-fi speculation (don't take too seriously):

The ASI could exist first and disguise itself as an AGI while it improves in secret.

Alternatively, an ASI already exists and is running our simulation to see how simulated minds (us) develop AGI.

3

jlpt1591 t1_it0v7pf wrote

Anyone who says AGI has to come before ASI is an idiot, depending on their definition of ASI.

1

Kolinnor t1_it1fo6u wrote

The argument "we don't even know how our own intelligence works" fails all the time, even more in the light of the new progresses in AI.

Before 2022, you could have argued AI art was decades down the road, since we have absolutely no clue how the brain processes different concepts and ties them together coherently to create art. Same thing with Go, protein folding...

2

BbxTx t1_it2d35t wrote

This is what I’ve been thinking as well. The huge assortment of narrow, specialized, cross-domain AIs will cause a singularity to happen before a mature AGI is realized. I think it will happen soon.

3