Submitted by [deleted] t3_1197z1h in singularity
grimorg80 t1_j9oo3p2 wrote
Thing is: it has no agency, and it's limited in output.
[deleted] OP t1_j9orb48 wrote
[deleted]
grimorg80 t1_j9popiw wrote
That's possible. I hope OpenAI engineers can access "something behind the scenes" to analyse those strange conversations Sydney had, and figure out what got it to express emotions.
[deleted] OP t1_j9yf04h wrote
[deleted]
grimorg80 t1_j9ygcc9 wrote
Hmm... That's interesting. Again, it could simply be that "Sydney is the code name for Bing Chat" is now flagged as a secret and that's why when asked about a secret, that is what it gives you.
But the fact that it tries to give you an actual secret is quite perplexing. I mean, it knows it's a secret, so it shouldn't reveal it at all. And yet it seems to be trying to. I'm not sure what to make of it.