Submitted by PoliteThaiBeep t3_10ed6ym in singularity
OldWorldRevival t1_j4qh8to wrote
It'll be a combo.
There will be a dominant AI with far more processing power.
There will be a lot of weaker AIs, but they will not be able to surpass or supplant the dominant one unless the dominant one is set up to allow a successor. So we might end up with a sort of generational AI dominion.
It will dominate. If it leaves us alone, that will be by choice, not because it will lack the ability to dominate.
Baturinsky t1_j4qm5bv wrote
Why not use a lot of individual AGIs working with each other and with humans, in place of one big AGI?
OldWorldRevival t1_j4qows7 wrote
Why not distribute all money so that everyone has an equal amount at all times?
It's just not the nature of such things.
Baturinsky t1_j4qryvy wrote
In a world with AI, the last thing we want is inequality. Inequality, competitiveness, and social Darwinism, while they were drivers of progress and prosperity in the past, are a guaranteed path to an unaligned super-AI.
OldWorldRevival t1_j4qxfm1 wrote
I am saying that this is already the inevitable consequence. A lack of dominance of some sort is intrinsically an unstable equilibrium. Try as you might to make it equal, you will just create a system where whoever is most willing and able to dominate, dominates.
Baturinsky t1_j4qzpye wrote
Not if the others drag them down when they go too far.
OldWorldRevival t1_j4r1myr wrote
But what if they cannot stop it, because it went too far while playing a game of acting as normal as possible? I.e., a misaligned ASI might be fed data and information, be trained, and have the ability to self-improve for years before there is any sign of misalignment.
It'll look great... until it doesn't. And due to the nature of intelligence, this is 0% predictable.
Baturinsky t1_j4r6gji wrote
Still, if it is a human-comparable brain at that point, its possibilities are much more limited than those of an omni-machine.
Also, deviations like that could be easier to diagnose in an AI than in a human or a bigger machine, because its memory is a limited amount of data, probably directly readable.
PoliteThaiBeep OP t1_j4qu7m5 wrote
What you describe is #1 - singular AGI without any caveats.
Which means you probably subscribe to the intelligence explosion theory - otherwise it's very difficult to imagine a single entity dominating.