nonusedaccountname t1_jbz3fcc wrote
Reply to comment by Jasrek in ChatGPT or similar AI as a confidant for teenagers by demauroy
The issue here isn't that children can talk to it. In fact, it's probably a useful tool for teenagers to ask questions they could get in trouble for asking, like sex education in more close-minded communities. The issue is that, in the example, the AI wasn't able to pick up on the subtle context clues across multiple messages that a human could. If an adult were told those things, they would know something is wrong and could help the child, whereas the AI can't, even if it did understand.