Substantial-Orange96 t1_j58let1 wrote

I feel like their existing narrow AIs could be partially responsible for some of the negative social trends we already face, so it’s not a good thing that they’re lowering safety standards even further...

29

Introsium t1_j5awirb wrote

The narrow AIs are textbook cases of misalignment. When the algorithm is optimizing for a goal like “amount of time people spend watching YouTube videos”, we get exactly what we asked for, and what no one fucking wants.

The problem with these applications is that they’re not aligned with human values because they’re not designed for or by humans.
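To make the misalignment point concrete, here is a minimal, purely illustrative sketch of a recommender whose only objective is predicted watch time. All of the item names and numbers are invented for the example; the point is just that greedily maximizing the proxy metric can tank the thing people actually value, exactly as specified.

```python
# Toy illustration of objective misspecification: a recommender that only
# optimizes "time spent watching" and ignores everything else.
# (All data below is made up for the example.)

items = [
    # (title, predicted_watch_minutes, user_wellbeing_score)
    ("calm explainer video", 8, +2),
    ("outrage-bait rant", 35, -3),
    ("conspiracy rabbit hole", 60, -5),
    ("cat video", 5, +1),
]

def recommend(candidates):
    # The deployed objective: maximize expected watch time, full stop.
    return max(candidates, key=lambda item: item[1])

choice = recommend(items)
print(f"Recommended: {choice[0]} ({choice[1]} min, wellbeing {choice[2]:+d})")
# The proxy metric (watch time) is maximized exactly as asked for,
# while the unmeasured thing people actually care about (wellbeing) is ignored.
```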

13

orincoro t1_j5kacvx wrote

Exactly. Cue an instant tragedy of the commons. We’ll just end up with the next wave of right-wing radicalizing content, only it will be chatbots that will do anything to “engage” people, with no regard for anything else.

5

MoistPhilosophera t1_j5khc8j wrote

> The problem with these applications is that they’re not aligned with human values because they’re not designed for or by humans.

Never were. They're designed to maximize profits.

Even if some stupid humans are involved in the game, it makes no difference.

3

TheLastSamurai t1_j5b3o8u wrote

Burn it all down. This will literally make our lives worse

0

JackIsBackWithCrack t1_j5c533f wrote

Just like the printing press and the sewing machine!

−1

TheLastSamurai t1_j5cnw4i wrote

Those actually had an overall positive impact on society; this does not.

5

get-azureaduser t1_j5dybwh wrote

You missed the point. At the time, people said the same thing about the printing press. Common people were not allowed to read the Bible or any books, and when the printing press came out the aristocracy flipped out, saying it would be the downfall of society and make our lives worse. Do you remember elevator attendants? That was because back in the ’20s we were so afraid of elevators killing us that we had professionals operate them.

−1

orincoro t1_j5kavus wrote

People did not say that the printing press would make society worse. You’re full of shit.

2

get-azureaduser t1_j5o9fsi wrote

The clergy most definitely did. There was a reason church services were held only in Latin (which no one even spoke) and books were hand-scribed by monks in the church (so one monk’s translation and understanding of the text varied from parish to parish). Literacy was seen as a blessing from God reserved for the elite, and only those in the church had the status to read. Social life, class, and ways of living were all dictated by the clergy’s interpretation of the Bible, because they had the only copy. You were not allowed any foreign thought or independent interpretation of what the Bible said because, well, you had technically never seen it.

It was no coincidence that the first book Gutenberg ever mass-printed was the Bible. When commoners gained the ability to own a book, they could now access the Bible in their own language. The clergy were outraged. Translators, printers, owners of a Bible, and those wanting to share what they’d learned were now targeted by the Church’s inquisitors. Many were arrested, burned at the stake, roasted on spits, sentenced to life in prison, or sent to the galleys. Men and women gave up their lives for the sake of reading the Bible in their own language. In 1559 Pope Paul IV forbade the ownership of any translations in Dutch, English, French, German, Italian, and Spanish (and even some Latin ones!).

Mobs of unsuspecting people who didn’t have the faintest idea of the real motives were incited (another tactic carried into the 20th century) to carry out the work of stamping out “heretics”: first individuals, then entire towns and villages, spreading to all-out war between nations and to the deposing and manipulation of kings and queens. Bibles were now being burned by the thousands, a practice that actually carried on into the 20th century.

0

orincoro t1_j5ocrmk wrote

Ah, so the people who stood to lose power directly because of the press said it was evil? Color me fucking shocked. Is that the best you’ve got?

This invention is not empowering common people like the press did. It empowers the already powerful to accumulate yet more power. Show me how it’s anything else.

2

acosm t1_j5dgps1 wrote

Just because some innovations have positive impacts doesn't mean all do.

3

orincoro t1_j5kasve wrote

The printing press was used to print Hitler’s book too. If you don’t think this is going to have similar consequences, you’re very much mistaken. New mass media is adopted by radical political movements faster than by anybody else.

When the printing press was invented, most people couldn’t read. This isn’t even close to the same kind of situation.

1

orincoro t1_j5k9l1z wrote

“Facebook commenter SLAMS Google Ethics Team, Lashes out over Culture War.”

For example.

2

False_Grit t1_j5gj816 wrote

WTF is everyone talking about?? Safety standards?? You mean not letting the A.I. say "mean" or "scary" or "naughty" things? You realize this is all bullshit safety theater, right? You know you could literally just search Google and find all of those things written by humans already?

Blue Steel? Ferrari? Le Tigra? They're the same face! Doesn't anybody notice this? I feel like I'm taking crazy pills! I feel like I'm taking crazy pills!

Not triggering people and not offending anyone doesn't make for a safer world. In studies with rats, if you pick up the baby rats with a soda bottle (the "humane" way that doesn't cause them any trauma) when moving them from cage to cage, they end up with a myriad of psychological and social problems in adulthood.

The rats need to experience some adversity in childhood or they don't develop normally. So do people. A too easy life is just as dangerous as a too difficult one. Let the A.I. say whatever the hell it wants. Have it give you a warning if you're going into objectionable territory, just like google safesearch. Censorship doesn't breed anything resembling actual safety.

Rant over.

1

orincoro t1_j5ka4dv wrote

  1. Not letting AI spread misinformation when being used in an application where the law specifically protects people from this use.
  2. Not allowing AI to be used to defeat security or privacy protections, or to facilitate misinformation, spam, harassment, or other criminal behavior (and this is a very big one).
  3. Not allowing AI to access, share, reproduce, or otherwise use restricted or copy protected material it is exposed to or trained on.
  4. Not allowing a chat application to violate or cause to be violated laws concerning privacy. There are 200+ countries in the world with 200 legal systems to contend with. And they all have an agenda.
5

False_Grit t1_j5ragkr wrote

Hmm. Good point. Thank you for the response.

I still feel the answer is to increase the reliability and power of these bots to spread positive information, rather than just nerfing them so they can't spread any misinformation.

I always go back to human analogues. Marjorie Taylor Greene has an uncanny ability to spread misinformation, spam, and harassment, and to actually vote on real-world, important issues. Vladimir Putin is able to do the same thing. He actively works to spread disinformation and doubt. There is a very real threat that, without assistance, humans will misinform themselves into world-ending choices.

I understand that A.I. will be a tool to amplify voices, but I feel all the "safeguards" put in place so far are far more about censorship and the appearance of safety than about actual safety. They seem to make everything G-rated, but you can happily talk about how great it is that Russia is invading a sovereign nation, as long as you don't mention the "nasty" actual violence that is going on.

Conversely, if you try to expose the real-world horrors of the war and the people actually dying in Ukraine right now: the civilians being killed, the electricity infrastructure in towns destroyed right before winter, people killed through freezing, it will flag you for being "violent." This is the opposite of a safeguard. It gets people killed through censorship.

Of course, I have no idea what the actual article is talking about since it is behind a paywall.

1

orincoro t1_j5smbpc wrote

You have an inherent faith in people and systems that doesn’t feel earned.

1