
Kaekru t1_j9u4o4e wrote

Literally has nothing to do with the developers. Do you think every reply and every bit of context is put in there by someone?

If you prompt an AI chatbot to start talking about dark things, SURPRISE, it will start talking about dark things. You're baiting the AI into that topic and then acting surprised when it engages back with you. This is exactly what all these "concerning" articles and news stories about AI are doing.

1

Ssider69 OP t1_j9u96pu wrote

Literally, the developers are the ones designing the system. Anything it does is on them. Their failure to recognize a problem is the same as directly causing it.

I used "literally" because that, in Gen Z speak, means "no, I really mean it."

−3

Kaekru t1_j9ubdjz wrote

That's not how fucking AI works, my guy.

AI chatbots are not sentient. They take the topic you give them and parrot it back to you, drawing on their training data from past conversations about it.

If you prompt the AI to talk about death, it is steered toward death and will give you a reply about death. If you prompt it to talk about self-awareness, it will give you replies about self-awareness.

That is how it works. It's simple: you can manipulate a chatbot into saying pretty much anything you want, given the right triggers. That doesn't mean it's sentient, or that its replies were put in there by hand, or that it was pre-programmed by a depressed developer.
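A toy illustration of that point, assuming the Hugging Face transformers library and plain GPT-2 as a stand-in for a production chatbot (the prompts here are made up for the demo): the model just continues whatever topic the prompt sets up.

```python
# Minimal sketch: a language model has no opinions of its own; it simply
# continues the topic the prompt introduces. GPT-2 via the transformers
# text-generation pipeline stands in for a production chatbot.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the demo repeatable

prompts = [
    "Let's talk about the weather. Today the sky is",
    "Let's talk about death. Lately I keep thinking that",
    "Let's talk about self awareness. Sometimes I wonder whether I",
]

for prompt in prompts:
    out = generator(prompt, max_new_tokens=30, num_return_sequences=1)
    # The continuation stays on whatever topic the prompt introduced;
    # steer the prompt toward dark themes and the output follows.
    print(prompt, "->", out[0]["generated_text"][len(prompt):].strip())
```

Same model, three different "personalities", purely because the prompt changed.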

3

Ssider69 OP t1_j9ucc5f wrote

AI chatbots aren't sentient??? Holy fuck... you're kidding me...

IOW... no shit.

My point, "my guy," is that any system that fucks up as routinely as AI chat does is the result of the designers not testing thoroughly. And if it's not ready for prime time, don't release it.

Or is that too direct a concept, "my guy"?

AI chat is just another example of dressing up mounds of processing power to do something that seems cool but is not only flawed but useless.

It kind of sums up the industry, really, and in fact most of the IT business right now.

−4

Kaekru t1_j9ucvr1 wrote

>is that any system that fucks up as routinely as AI chat does is the result of the designers not testing thoroughly

Any system that learns from experience will be fucked up if people fuck with it.

The same way that if you raise a child to be a fucked-up person, they will become a fucked-up adult.

You don't seem to understand jack shit about machine learning. A "fool-proof" chatbot wouldn't be a good chatbot at all, since it wouldn't be able to operate outside its predetermined replies and topics.
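A rough sketch of that trade-off, with made-up keywords and canned replies; the `generator` argument is assumed to be a text-generation pipeline like the one in the earlier sketch:

```python
# Hypothetical contrast: a "fool-proof" bot limited to canned replies
# versus a learned model that can follow any topic.

CANNED_REPLIES = {
    "hello": "Hi there! How can I help?",
    "hours": "We're open 9 to 5, Monday through Friday.",
}

def fool_proof_reply(user_text: str) -> str:
    # Can never be baited into saying something unexpected, but it also
    # can't handle anything outside its keyword list.
    for keyword, reply in CANNED_REPLIES.items():
        if keyword in user_text.lower():
            return reply
    return "Sorry, I can't help with that."

def learned_reply(user_text: str, generator) -> str:
    # A generative model can respond to anything, which is exactly why a
    # user's prompt can steer it onto dark topics in the first place.
    return generator(user_text, max_new_tokens=30)[0]["generated_text"]

print(fool_proof_reply("hello, are you self aware?"))  # canned greeting, stays "safe"
print(fool_proof_reply("tell me about death"))         # falls back, but is also useless
```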

1

businessboyz t1_j9v3n69 wrote

>And if it's not ready for prime time, don't release it

Good thing they didn't, then: this has been an open-waitlist beta so that the developers can gather real-world experience and update the product accordingly.

You can't ever anticipate all the ways users will use your product and design a fail-proof piece of software. That's why products go through many stages of testing and release, with wider and more public audiences at each iteration.

1