Submitted by sigul77 t3_10gxx47 in Futurology
2109dobleston t1_j59mo3o wrote
Reply to comment by DoktoroKiu in How close are we to singularity? Data from MT says very close! by sigul77
The singularity requires sentience and sentience requires emotions and emotions require the physiological.
tangSweat t1_j59zg1i wrote
At what point, though, do we say an AI is sentient, if it can understand the patterns of human emotion and replicate them perfectly, has memories of its life experiences, forms "opinions" based on the information it deems most credible, and has a desire to learn and grow? We set a far lower bar for what counts as sentient in the animal kingdom. It's a genuine philosophical question many are talking about
JorusC t1_j5d6l5w wrote
It reminds me of how people criticize AI art.
"All they do is sample other art, meld a bunch of pieces together into a new idea, and synthesize it as a new piece."
Okay. How is that any different from what we do?
2109dobleston t1_j5avx9t wrote
Sentience is the capacity to experience feelings and sensations.
tangSweat t1_j5deh0t wrote
I understand that, but feelings are just a construct of human consciousness, a byproduct of our brain trying to protect us from threats back in prehistoric times. If an AGI were using a black-box algorithm that we can't access or understand, then how do you differentiate between clusters of transistors and clusters of neurons, both firing in mysterious ways and producing different emotions? AIs like ChatGPT are trained with rewards and punishments, and they are coded in a way that lets them improve themselves; that's not really different from how we evolved, except at a much faster pace
2109dobleston t1_j5duhep wrote
Feelings are a biological act.
DoktoroKiu t1_j5ai7o2 wrote
I would think an AI might only need sapience, though.