Comments

mrpimpunicorn t1_j61zesc wrote

Higher intelligence does not entail some profound sense of morality or intrinsic agency; it simply means the ASI is better at achieving its terminal goals (whatever those may be) than humans are. What do you mean by "manipulate" in this context? Manipulate what? The ASI will have the terminal goals it is programmed to have, and it will intelligently fulfill those goals: nothing more, nothing less. The goals themselves are not "manipulation" and can be arbitrary and irrational if one chooses.

Whether a reprobate or a saint creates ASI first, it makes no difference to the ASI. You have absolutely no moral guarantees based on intelligence, only on the will of the creator.

4

AsheyDS t1_j620mvq wrote

>In other words, there is no logical way to manipulate beings of higher intelligence.

Then don't use logic.

3

Yuli-Ban t1_j62a3gm wrote

Yeah, no, the status quo is definitely getting upended, something fierce. Rather than an oxymoron, what's coming is more of a paradox, since a lot of people tend to like the status quo. In other words, you've got it backwards: AGI won't preserve the status quo; the status quo will go out of its way to preserve itself in the face of AGI.

If AGI took decades or centuries to arrive, this wouldn't be much of an issue. But I expect AGI to be realized uncomfortably soon. There will soon be people who can say, "I lived to see both the start of the Great Depression and the rise of AGI."

Humans are extremely reactionary creatures; we resist change. I can attest to this myself: I still remember my modest disgust, about a decade ago, at merely having to move a computer off a nook, which woke me up to the fact that I'm still human, and I still don't like changing things either.

AGI will come too quickly for humans to widely accept the changes it will cause.

On a macro scale this is clearly unsustainable, but on a micro scale there are just too many people who won't tolerate things like UBI, techno-social ownership, or transhumanism in their daily lives.

The idea that the average person won't try to stay in the pre-AGI era, perhaps indulging only in the conveniences that aren't too "extreme", comes from being terminally online or too detached from the common person to see the truth on the ground.

That's why something resembling the status quo will probably remain, perhaps even an entire alternative economy. I've grown less and less sure that "bullshit jobs" will go away, because I'm just too convinced that people will protect the status quo irrationally.

And this is going to cause a lot of problems, because AGI isn't something that can just "run civilization in the background" like a shadow digital emperor. It's going to have very wide-ranging effects on everything. Hence I call this coming era "Bizarro Civilization."

3

Spire_Citron t1_j62e6uc wrote

Something can be extremely intelligent and also designed to serve a particular purpose. Just because it knows it's being manipulated doesn't mean it would necessarily object to that. An AI isn't human and it may not have the same motivations as us, no matter how smart it is.

2

onyxengine t1_j62hiot wrote

Everything has limitations, and for some time the AGIs we build will be bound by the limitations we place on them. The details matter: a hyper-intelligent AI confined to a room with no internet access or any ability to communicate with humans probably couldn't accomplish much.

Let it talk to a small group of people, though, and it might be able to convince them to provision it with the minimum resources it needs to seize control of the entire planet.

2