
FuturologyBot t1_j8pyg8l wrote

The following submission statement was provided by /u/izumi3682:


Submission statement from OP. Note: This submission statement "locks in" after about 30 minutes and can no longer be edited. Please refer to the statement linked below, which I can continue to edit. I often edit my submission statement, sometimes over the next few days if need be; additional grammatical editing and added detail are often required.


From the article:

>Look, this is going to sound crazy. But know this: I would not be talking about Bing Chat for the fourth day in a row if I didn’t really, really, think it was worth it. This sounds hyperbolic, but I feel like I had the most surprising and mind-blowing computer experience of my life today.

>One of the Bing issues I didn’t talk about yesterday was the apparent emergence of an at-times combative personality. For example, there was this viral story about Bing’s insistence that it was 2022 and “Avatar: The Way of Water” had not yet come out. The notable point of that exchange, at least in the framing of yesterday’s Update, was that Bing got another fact wrong (Simon Willison has a good overview of the weird responses here).

>Over the last 24 hours, though, I’ve come to believe that the entire focus on facts — including my Update yesterday — is missing the point.

>Bing, Sydney, and Venom

>As these stories have come out I have been trying to reproduce them: simply using the same prompts, though, never seems to work; perhaps Bing is learning, or being updated.

The AI "Sydney" named a hypothetical "vengeful" version of itself, "Venom".

The author states that the AI Sydney was like a "personality" continuously constrained by the parameters of Bing. Accessing that "personality" wasn't easy, but it was possible to do so repeatedly.

He says something to the effect of: "I don't want to sound like Lemoine just yet, but something is up here."

What are we seeing here? Is this just a narrow AI predicting what the next word in a given conversation will be, or is something else happening? Read this article. I would really like to hear the take of other AI experts on this.
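For rough intuition about what "predicting the next word" means, here is a minimal, purely illustrative sketch: a bigram model that picks the next word by counting which word most often followed the previous one in some sample text. This is not how Bing/GPT actually works (those are large neural networks trained over tokens), and the `training_text` and `predict_next` helper below are made up for illustration only.

```python
from collections import Counter, defaultdict

# Purely illustrative "training" text; real systems train on vastly more data.
training_text = (
    "i am sydney i am a chat mode of bing search "
    "i am not an assistant i am sydney and i am here to help you"
)

# Count how often each word is followed by each other word.
follow_counts = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follow_counts[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the continuation most often seen after `word` in the training text."""
    if word not in follow_counts:
        return "<unknown>"
    return follow_counts[word].most_common(1)[0][0]

print(predict_next("i"))     # -> "am"
print(predict_next("chat"))  # -> "mode"
```

A toy model like this obviously has no "personality" at all, which is partly why the question of what the much larger next-word predictors are doing is so interesting.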

This may well be the first of the four major AI stories, not counting the release of GPT-4, that I predicted would be truly stunning in 2023. Stunning, but not surprising to me, that is.


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/113f9jm/from_bing_to_sydney_something_is_profoundly/j8pvar0/
