Desperate_Food7354 t1_j19p6yg wrote
Reply to comment by Donkeytonkers in Why do so many people assume that a sentient AI will have any goals, desires, or objectives outside of what it’s told to do? by SendMePicsOfCat
I think your entire premise of the AGI being like a 12-year-old preteen is wrong. The AGI doesn't have a limbic system, it has no emotions, and it was not sculpted by natural selection to care about survival in order to replicate its genetic instructions. It can have all the knowledge of death, know that it could be turned off at any moment, and not care. Why? Because it isn't a human that NEEDS to care because of the evolutionary pressure that formed the neural networks to care in the first place.