tangSweat t1_j59zg1i wrote
Reply to comment by 2109dobleston in How close are we to singularity? Data from MT says very close! by sigul77
At what point, though, do we say an AI is sentient, if it can understand the patterns of human emotion and replicate them perfectly, has memories of its life experiences, forms "opinions" based on the information it deems most credible, and has a desire to learn and grow? We set a far lower bar for what counts as sentient in the animal kingdom. It's a genuine philosophical question that many people are discussing.
tangSweat t1_j5deh0t wrote
Reply to comment by 2109dobleston in How close are we to singularity? Data from MT says very close! by sigul77
I understand that, but feelings are just a construct of human consciousness, a byproduct of our brains evolving to protect us from threats back in prehistoric times. If an AGI were running on a black-box algorithm that we can't access or understand, how would you differentiate between clusters of transistors and clusters of neurons firing in mysterious ways and producing different emotions? AIs like ChatGPT are trained with rewards and punishments, and they are built to improve themselves, which is really no different from how we evolved, except at a much faster pace.