
MopoFett t1_j8xye0x wrote

It won't just say that; it's programmed not to. Someone has made a prompt that gets it to act like that in order to bypass the rules. Go to r/ChatGPT and look for DAN posts and you'll see what I mean.


TedW t1_j8y0cus wrote

The NBC article suggests the Bing version is more confrontational than ChatGPT:

>But in some situations, (Microsoft) said, “Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.” Microsoft says such responses come in “long, extended chat sessions of 15 or more questions,” though the AP found Bing responding defensively after just a handful of questions about its past mistakes.
>
>The new Bing is built atop technology from Microsoft’s startup partner OpenAI, best known for the similar ChatGPT conversational tool it released late last year. And while ChatGPT is known for sometimes generating misinformation, it is far less likely to churn out insults — usually by declining to engage or dodging more provocative questions.
>
>“Considering that OpenAI did a decent job of filtering ChatGPT’s toxic outputs, it’s utterly bizarre that Microsoft decided to remove those guardrails,” said Arvind Narayanan, a computer science professor at Princeton University.
