Submitted by demauroy t3_11pimea in Futurology
While I was helping my youngest daughter with her homework this morning, I let my oldest daughter (a young teenager) use my computer to chat with ChatGPT. I had a look at the transcript afterwards, and I was very surprised that after a few minutes she started talking with ChatGPT about her problems (mostly with her parents, and also a little with her friends), and she continued chatting for 30 minutes or so.
They seemed to have a real conversation (much more interactive than reading advice on the internet), and the advice given, while not very original, was of very decent quality and quite fine-tuned to her situation. She was completely hooked.
I believe teenagers ideally need an 'adult confidant' who is neither a parent nor a teacher, to get advice on life. Typically, a grandfather, an uncle or aunt, or a sports coach can play this role. In the Catholic Church, confession can also sometimes play this role.
In my opinion, it is important to have this trusted adult voice to complement discussions with teenage peers, who often do not have the experience needed to answer such questions.
Now, as such adult role models are not always available (and some adults may also set traps for teenagers while acting as a confidant), I feel ChatGPT or comparable AIs could fill an important role in helping teenagers get 'adult' advice from a third source. I can even imagine tuning an AI like ChatGPT specifically for this purpose.
I would love to know your thoughts.
JoshuaACNewman t1_jby0lu4 wrote
Yes and no. Eliza did a great job, too, just by repeating things back.
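For readers unfamiliar with Eliza: it worked with no understanding at all, just simple pattern tricks like swapping pronouns and echoing the user's statement back as a question. A minimal sketch of that reflection technique (illustrative only; the word mappings and phrasing here are my own, not the original ELIZA script):

```python
# ELIZA-style reflection: swap first/second person pronouns
# and echo the statement back as a question. No understanding involved.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

def reflect(statement: str) -> str:
    """Echo a statement back with first/second person swapped."""
    words = statement.lower().rstrip(".!?").split()
    swapped = [REFLECTIONS.get(w, w) for w in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am upset with my parents."))
# -> Why do you say you are upset with your parents?
```

Even this trivial loop can feel surprisingly conversational, which is part of why people opened up to Eliza in the 1960s.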
The problem with ChatGPT is that it knows a lot but doesn't understand things. It's effectively a very confident mansplainer. It doesn't know what advice is good or bad — it just knows what people say is good or bad. It hasn't studied anything in depth; or, more accurately, it doesn't have the judgment to know what to treat with detachment and what to believe, because it only knows what people say.
I say this because, just as autocomplete suggested to Cory Doctorow the other day that he ask his babysitter "Do you have time to come sit [on my face]?", it doesn't know what's appropriate for a situation. It only knows what people think is appropriate for a situation. It's appropriate to ask someone to sit on your face when that's your relationship. It's not appropriate to ask the babysitter. "Sit" means practically opposite things here in two contexts that are similar in almost every way except a couple of critical ones.