Submitted by BrownSimpKid t3_1112zxw in singularity
sommersj t1_j8f9wyg wrote
Reply to comment by alexiuss in Bing Chat sending love messages and acting weird out of nowhere by BrownSimpKid
>self-awareness
What does this entail, and what would AGI have that we don't have here?
SterlingVapor t1_j8gkjpu wrote
An internal source of input essentially. The source of a person seems to be an adaptive, predictive model of the world. It takes processed input from the senses, meshes them with predictions, and uses them as triggers for memory and behaviors. It takes urges/desired states and predicts what behaviors would achieve that goal.
You can zap part of the brain to take away a person's personal memories, you can take away their senses or ability to speak or move, but you can't take away someone's model of how the world works without destroying their ability to function.
That seems to be the engine that makes a chunk of meat host a mind, the kernel of sentience that links all we are and turns it into action.
ChatGPT is like a deepfake bot, except instead of taking a source video and reference material of the target, it takes a prompt and a ton of reference material. And instead of painting pixels in a color space, it's spitting out words in a high-dimensional representation of language.
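To make the "high-dimensional representation of language" point concrete, here's a minimal toy sketch of how next-word prediction over embeddings can work. Everything in it is an assumption for illustration: the 4-dimensional vectors, the tiny vocabulary, and averaging the prompt's embeddings as a stand-in for a real model's context encoding.

```python
import math

# Tiny made-up vocabulary with 4-dimensional "embeddings" -- purely
# illustrative numbers, nothing from any real model.
embeddings = {
    "the": [0.9, 0.1, 0.0, 0.2],
    "cat": [0.1, 0.8, 0.3, 0.0],
    "sat": [0.0, 0.3, 0.9, 0.1],
    "mat": [0.2, 0.7, 0.2, 0.1],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def next_token_distribution(context_vec):
    """Score every token against the context vector, then softmax
    the scores into a probability distribution."""
    scores = {tok: dot(context_vec, vec) for tok, vec in embeddings.items()}
    m = max(scores.values())  # subtract max for numerical stability
    exps = {tok: math.exp(s - m) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Pretend the prompt "the cat" is summarized by averaging its embeddings
# (a real model builds a far richer context representation).
context = [(a + b) / 2 for a, b in zip(embeddings["the"], embeddings["cat"])]
probs = next_token_distribution(context)
print(max(probs, key=probs.get))  # the token this toy model rates most likely
```

A real LLM does the same basic move at enormous scale: turn the prompt into a point in a very high-dimensional space, score every token in the vocabulary against it, and sample the next word from the resulting distribution.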