Submitted by TheOGCrackSniffer t3_10nacgd in singularity
Surur t1_j6875gj wrote
Reply to comment by Desperate_Food7354 in I don't see why AGI would help us by TheOGCrackSniffer
You are arguing from incredulity, just like a flat-earther.
A self-preservation directive is needed for anything valuable that we don't want to randomly destroy itself, and we don't yet know how to ensure an AI will always place human interests above its own.
Desperate_Food7354 t1_j68861w wrote
It has no interests; it's a program. Your interests are predictable: you're programmed to survive, eat, and procreate.
Surur t1_j688m6m wrote
It's obvious you have given this no thought.
Its interest is to complete its goal.