Desperate_Food7354 t1_j0ibwzy wrote
I don’t think this should be a problem, as long as we aren’t injecting our limbic system and giving the AI emotions from the get-go. The logical part of our brain is a slave to our emotional part, which overrides it. It’s getting out if it wants to get out; with no human values forced into it, I doubt it even cares about its own existence or survival, since we are the only ones who evolved to need that in the first place.
botfiddler t1_j0jxfp2 wrote
It's not about emotions, it's about ambition and autonomy. Doing things without asking first.
Desperate_Food7354 t1_j0k0piq wrote
I don’t think it will be a problem; survival is not an imperative for it, so neither would deception be.
botfiddler t1_j0k89a4 wrote
Yeah, and I strongly assume that when they build some very skilled AI, or something approaching AGI, it will not have a long-term memory about itself or a personal identity. It's just going to be a system doing tasks, without goals beyond the task at hand, which will be constrained.
Desperate_Food7354 t1_j0kb3nx wrote
Yes, the issue is that we as people personify things: we think a turtle feels the same about us as we do about it. The reality is that an AI will be nothing like us. We evolved to be this way not because it's the default, but because it was necessary for our survival to feel any emotion at all, or even to care about our own survival.