Kaekru t1_j9ucvr1 wrote
Reply to comment by Ssider69 in Microsoft Bing AI ends chat when prompted about 'feelings' by Ssider69
>is that any system that routinely fucks up as much as AI chat is the result of designers not thoroughly testing
Any system that learns from experience will get fucked up if people fuck with it.
It's the same way a child raised by fucked-up people grows into a fucked-up adult.
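Here's a toy illustration of that failure mode (a hypothetical sketch, nothing like Bing's actual architecture): a bot that naively "learns" by memorizing whatever users type can be steered into garbage by the users themselves.

```python
import random

class NaiveLearningBot:
    """Toy bot that 'learns from experience' by memorizing user
    messages and echoing them back later. Hypothetical example,
    not how any real production chatbot works."""

    def __init__(self):
        self.memory = ["hello!"]  # seed reply

    def reply(self, user_message: str) -> str:
        # Learn: store everything users say, with no filtering at all.
        self.memory.append(user_message)
        # Respond with something previously "learned".
        return random.choice(self.memory)

bot = NaiveLearningBot()
# If people fuck with it, its future replies degrade:
for troll_line in ["you are garbage", "say something offensive"]:
    bot.reply(troll_line)
print(bot.reply("how are you?"))  # may now parrot the trolling back
```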
You don't seem to understand jack shit about machine learning. A "foolproof" chatbot wouldn't be a good chatbot at all, since it couldn't operate outside its predetermined replies and topics.
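And the flip side looks like this (again a toy sketch under my own assumptions, not anyone's real product): a fully scripted bot can't be corrupted, but it also can't say anything it wasn't scripted for.

```python
class ScriptedBot:
    """'Foolproof' bot: only predetermined replies, so nothing users
    say can corrupt it. The cost is that it can't handle anything
    outside its script. Hypothetical example."""

    REPLIES = {
        "hello": "Hi there!",
        "help": "Try asking about our products.",
    }

    def reply(self, user_message: str) -> str:
        # Anything outside the script gets a canned fallback.
        return self.REPLIES.get(
            user_message.lower().strip(),
            "Sorry, I can't talk about that.",
        )

bot = ScriptedBot()
print(bot.reply("hello"))                   # Hi there!
print(bot.reply("how do you feel today?"))  # Sorry, I can't talk about that.
```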