SendMePicsOfCat OP t1_j18xecv wrote
Reply to comment by __ingeniare__ in Why do so many people assume that a sentient AI will have any goals, desires, or objectives outside of what it’s told to do? by SendMePicsOfCat
Why would it need goals or objectives to do general work? Currently, every single machine learning algorithm waits for user input before doing anything, so why would AGI be any different?
There's no reason to give it a goal or objective. If we want a sentient AGI to complete a task, we can just tell it to, and observe its process as it does so. There is no need for it to create any self-starting direction or motive. All it needs in order to accomplish its purpose is a command and oversight.
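Here's a minimal sketch of what I mean by "command and oversight", purely reactive: nothing happens until a human issues a task, and a human approves each step. `AGIModel` and its methods are hypothetical stand-ins, not any real API.

```python
# Hypothetical sketch of a command-driven, human-in-the-loop pattern.
# The model never initiates anything on its own; it only responds to commands.

class AGIModel:
    def plan(self, task: str) -> list[str]:
        # A real system would generate steps; this stub just shows the shape.
        return [f"step 1 of '{task}'", f"step 2 of '{task}'"]

    def execute(self, step: str) -> str:
        return f"completed: {step}"


def run_task(model: AGIModel, task: str) -> None:
    """The model acts only when commanded, and a human approves every step."""
    for step in model.plan(task):
        if input(f"Approve '{step}'? [y/n] ") != "y":
            print("Step rejected; halting.")  # oversight can stop it at any point
            return
        print(model.execute(step))


if __name__ == "__main__":
    run_task(AGIModel(), "summarize this document")
```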
ASI will need goals and objectives, but those will be designed as well. There is no circumstance where an AI, AGI, or ASI will be allowed to make any decisions about its base programming.