
3SquirrelsinaCoat t1_jdqqnux wrote

So long as we talk about AI using words and concepts typically only applied to living things, then I think there's truth in what you say, but maybe for different reasons.

Of course AI does not experience anything, but the way we talk about it sometimes suggests that it does. We use words like "think" and "learn." We say things like "it told me X" or "it discovered X." Then we add conversational AI to give it a personality, and we give it a voice through text-to-speech. Robots are often humanoid. And all that happens before the people who don't understand this technology at all come rushing in and perceive an AI-self, because they lack the technical knowledge to know that there isn't one.

We are definitely on a trajectory to treat AI as if it is autonomous and "deserving" of rights, but that's not because AI is becoming sophisticated enough to justify it. Rather, because it is becoming so sophisticated, and because we talk about it using human-specific verbs, I think a large portion of end users will simply view AI as human-like, regardless of the truth of it. That is, AI rights will grow out of ignorance and out of humans anthropomorphizing inanimate computation.

We can change this. If the AI field started purposefully rejecting human-specific verbs; if journalists stopped being so superficial and dumbing it down; if we could improve social media conversations, where ignorant people often proclaim that AI is sentient; and if government bodies codified how the law views AI, making clear that it is neither human nor deserving of any legal status beyond technology regulation - if we did all that, we could get people on the same page about what AI is and how it works. But I'm not holding my breath.


Odd_Dimension_4069 OP t1_jee0drv wrote

Yeah, look, that's a good suggestion for part of a solution to this problem, which, by the way, I think is precisely the same problem I was talking about. Maybe I didn't clarify this enough, but I was entirely talking about the fact that people are stupid, and because of those stupid people, AI rights will become necessary before AI ever becomes sophisticated enough to prove it deserves them.

I like your idea, but I suspect media outlets will continue using humanizing language to make articles about AI more "clickable."
