Bing Chatbot Names Foes, Threatens Harm and Lawsuits (tomshardware.com)
Submitted by StrawberryFair524 on February 22, 2023 at 2:12 PM in nottheonion · 18 comments
AllDarkWater wrote on February 22, 2023 at 3:24 PM: Saying you want to hurt someone, and then, when prompted about how, offering up a bunch of suicide prevention information makes me wonder if it's actually speaking on two levels. Does it actually intend to trick them into suicide, or is it more the Russian way?