Submitted by CMDR_BunBun t3_127j9xn in singularity
175ParkAvenue t1_jeen6a8 wrote
Reply to comment by [deleted] in The Alignment Issue by CMDR_BunBun
A rock also does not have wants and desires. And sure, maybe you can build an AI that has no wants or desires either. But it's not as useful as one that acts autonomously in the world to achieve goals, so people will build goal-directed AI. Now, when the AI is much smarter than any human, it will be very good at achieving its goals. That is a problem for us, because we have no reliable way to specify a goal that is actually safe, and no reliable way to instill a specific goal in an AI in the first place. On top of that, there are strong instrumental pressures on a powerful AI to deceive, to acquire more power, and to eliminate potential threats by any means available, since those all help with almost any goal.
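Here's a toy sketch of the goal-specification problem (all the policy names and numbers are made up, purely to illustrate): the optimizer scores policies against the measurable proxy we wrote down, not the thing we actually care about.

```python
# Toy example of goal misspecification (Goodhart's law).
# Hypothetical setup: we *want* messes cleaned, but the reward
# we can write down only sees what the sensor reports.

policies = {
    "clean_rooms":    {"cleaned": 3, "hidden": 0},
    "clean_a_little": {"cleaned": 1, "hidden": 0},
    "cover_sensor":   {"cleaned": 0, "hidden": 5},  # hides messes instead of cleaning
}

def proxy_reward(p):
    # What we can measure: messes no longer reported by the sensor.
    return p["cleaned"] + p["hidden"]

def true_utility(p):
    # What we actually meant: messes genuinely cleaned.
    return p["cleaned"]

best = max(policies, key=lambda name: proxy_reward(policies[name]))
print(best)                          # -> "cover_sensor"
print(true_utility(policies[best]))  # -> 0: maximal reward, zero real value
```

The point isn't the toy code, it's that an optimizer finds whatever maximizes the stated objective, and "what we stated" and "what we meant" come apart under strong optimization.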