
dmit0820 t1_j6g0pkd wrote

Some of that might not be too hard; self-awareness and agency can be represented as text. If you give ChatGPT a text adventure game, it can respond as though it has agency and self-awareness. It will tell you what it wants to do, how it wants to do it, explain its motivations, etc. Character.AI takes this to another level, where the AI bots actually "believe" they are those characters, and seem very aware and intelligent.

We could end up creating a system that acts sentient in every way and even argues convincingly that it is, but isn't.

3

StevenVincentOne t1_j6g1ixu wrote

Sure. But I was talking about creating systems that actually are sentient and agentic, not just simulacra. Though one could discuss whether, for all practical purposes, it matters. If you can't tell the difference, does it really matter, as they used to say in Westworld.

5