Submitted by paulalesius t3_10m8pi7 in singularity
mrpimpunicorn t1_j61zesc wrote
Higher intelligence does not entail some profound sense of morality or intrinsic agency; it simply means the ASI is better than humans at achieving its terminal goals, whatever those may be. What do you mean by "manipulate" in this context? Manipulate what? The ASI will have the terminal goals it is programmed to have, and it will intelligently fulfill those goals: nothing more, nothing less. The goals themselves are not "manipulation," and they can be arbitrary or irrational if one so chooses.
Whether a reprobate or a saint creates ASI first makes no difference to the ASI. You have absolutely no moral guarantees based on intelligence, only the will of the creator.