Comments


codingai t1_iy8dva2 wrote

YouTube's yearly revenue is tens of BILLIONS of dollars. 13 million dollars seems like a small amount TO ME. 🙄

5

swordgeek t1_iy8gzbw wrote

Fluff news if ever there was such a thing.

0

Kikoska85 t1_iy8ht0f wrote

That’s like the cigarette companies looking into fighting nicotine addiction

33

blueline7677 t1_iy8i9sp wrote

I hope this doesn’t get rid of conspiracy theory videos. Those are some of the more entertaining genres of YouTube video, and they're already less common than they used to be because so many of them get demonetized as is.

−1

Musetrigger t1_iy8jx4a wrote

Here come the conservatives screaming about free speech.

−6

maztow t1_iy8k5t6 wrote

You mean the same companies that intentionally altered search algorithms to slander or undercut politicians and allowed terrorists to post execution videos?

13

8instuntcock t1_iy8l0p3 wrote

Google and YouTube are the same company, so...

4

Minorous t1_iy8l5b5 wrote

"But but just hear me out!? Why can't you listen to my side!!! The truth hurts you?" -- Provides no sources or evidence, just a link to quackery science on Youtube.

−3

DJScrambles t1_iy8m3q2 wrote

They are investing to fight the misinformation they don't like and promote the misinformation they do like

22

NativeCoder t1_iy8maqz wrote

I don’t want some hippies in California being the arbiters of truth for the world

18

ComradeCornflakes t1_iy8mblt wrote

Does anyone else find it hypocritical that they are doing this after removing the dislike feature? I can personally say that it is much harder to spot scams/misinformation without it.

Seems like a step forwards after two steps back.

85

Mirai4n t1_iy8mkkt wrote

Pixels suck with new exynos chips, fight this statement!

1

ndolphin t1_iy8mo16 wrote

While Twitter sits and goes DURRRRR...

2

JDGumby t1_iy8mtgt wrote

And by "misinformation" they mean "anything the USA's official enemies say that contradicts the USA's position on a subject".

8

TofuByte t1_iy8n0wm wrote

They need to invest more in removing “hate content”. Way too much racism and bigotry disguised as “oh we are just talking about stuff, yo”. Several channels that should have been nuked years ago are still spreading racism and hate. You can bet they won’t touch any channel generating money.

Edit: looks like this sub is no different than /g/ and /pol/ not really surprising though.

−8

yem_slave t1_iy8n405 wrote

Private companies deciding what is and isn't "true". What could go wrong

39

InHocWePoke3486 t1_iy8n7gz wrote

Doubtful. They profit from misinformation because of how engaging it is. They have no motive to actually fix this. Just window dressing.

1

TofuByte t1_iy8naz2 wrote

And we don't want closet klanners or their supporters to have a platform to voice their hatred and racism.

Edit: for the incredibly thick individuals here, I wasn’t calling you a klanner. I was speaking in general about the rampant rise in racists using online platforms to spread their hate.

−9

zencat420 t1_iy8nt89 wrote

I don't think national pride or security have anything to do with it, this looks good for their brand, and makes them likeable and appealing. Follow the money!

3

BlazingJava t1_iy8oafb wrote

Lol, most real libertarians also want freedom of speech. Why are people so uninterested in digging for more info? Wouldn't it be easier to explain than to censor online content?

4

BlazingJava t1_iy8opfg wrote

Hatred and racism is always the pretext, but in the end the real reasons are:

To censor political opposition

To cover corruption

Look at all the shills for the CCP in Western society, or Biden's son's behaviour and his laptop, or Tim Cook's phony "altruism" while exploiting slavery in China and in African countries.

5

KamikazeCoPilot t1_iy8pfe9 wrote

What I want is to have honest and open discussions from all sides...even those that are WAY out there (note: I understand that way out there is subjective). Let them have their speech. Have an open, honest DISCUSSION about everything. Even have a discussion about the studies that are used to present a point of view. Who funds a study, how is the data collected...everything. And do it all with a handshake going in and a handshake leaving. Understand that it is a discussion and not an attack. Be open-minded about the whole thing.

I despise all of this Hate. Both NativeCoder and TofuByte offered up insults; to be fair Tofu was matching Native's energy. Regardless, it didn't help anything. I'm not saying we have to be friends. But let's stop hating each other over the small stuff.

1

saraphilipp t1_iy8pul6 wrote

While simultaneously collecting every bit of information about you to sell for profit. Including spamming you with misinformation.

3

concretemike t1_iy8pzv6 wrote

So, does Google admit that the Hunter Biden laptop is a real story now???? Misinformation control is what they say....what I see is information control....freedom of speech is the 1st Amendment for a reason!!!!

1

me_at_myhouse t1_iy8q6fn wrote

Google and Youtube ARE the spreaders of misinformation.

The headline should be

Google and YouTube are investing to fight misinformation THAT THEY DON'T WANT YOU TO SEE BECAUSE IT DOES NOT AGREE WITH THEIR AGENDA.

−1

gregs1020 t1_iy8quua wrote

so worse than it already is, noted.

1

bored123abc t1_iy8rb37 wrote

Translation: They are investing to promote their version of “the truth” and their political agenda.

3

WyomingVet t1_iy8ro65 wrote

Is the misinformation whatever the government tells them it is? Like the Biden admin was doing with Facebook?

2

monadyne t1_iy8vggk wrote

>a platform to voice their hatred and racism.

You don't get it. When some group has that platform and spews some hateful racist rhetoric, most people will look at it and conclude that those people are horrible. A tiny fraction will be in agreement with their awful ideas. That's how freedom works.

2

NPD_wont_stop_ME t1_iy8vtd7 wrote

Aren't Google and YouTube owned by the same company? Lmao

1

improper84 t1_iy8xwy8 wrote

It would have been nice if they had done it before a good third of the country descended into madness and radicalization. Doing it now is a bit like closing the barn door after the horse already got out and memorized Mein Kampf.

4

habeus_coitus t1_iy8y0il wrote

Right, but then that tiny fraction starts getting other ideas…ideas like “if I can discredit my political opponents’ message while broadcasting my own far and wide then no one can stop me from inflicting my hate upon the world”. And they’ll hide behind the First Amendment along the entire way, claiming that you’re the one trying to silence them. You cannot have reasoned, measured discussion with hateful racists. If you give them an inch they will take a mile, then tell you you’re the oppressor for simply wanting even half of that inch back.

2

ikilledtupac t1_iy8ygvt wrote

Google and YouTube are building a Potemkin City to prevent regulation, you mean.

1

yem_slave t1_iy8zfu9 wrote

The social media companies absolutely took down actual factual information and called it misinformation during the pandemic. But the govt liked the message so they agreed with it and even colluded with them to have it taken down.

7

summmbodyoncetoldme t1_iy921c6 wrote

"War on misinformation" . Ugh. I always question the ethics of these well-intentioned endeavors.

One person's fake news is another person's reality. Whose reality will these tech behemoths allow to float to the top?

1

DirtyPolecat t1_iy92k6m wrote

>or Biden sons behaviour or his laptop

I don't get what people want to see out of this. How does Biden's son getting caught doing hookers and blow have anything to do with Joe Biden? I'd say the same thing for one of Trump's family.

Ever thought that maybe people just don't give a shit? No censoring necessary.

4

summmbodyoncetoldme t1_iy92tmm wrote

Is this post filled with right-leaning commenters, or did we actually find something that both sides of the aisle agree on!?!?

1

TofuByte t1_iy948eq wrote

Small stuff? Why don’t you just say “I’m privileged and I don’t want to be inconvenienced”? There should be no open discussion where racists have a chance to speak. If you had children, would you want child rapists to have a platform to talk about “how they see things”? I understand that is how the church works, but still. That would be like allowing Hitler a chance to speak and try to justify the slaughter. Once a person goes down a certain path they no longer deserve a voice nor a platform.

There is no grey area for discussion when it comes to topics of racism, homophobia and religious bigotry. You are either 100% against it or you are 100% for it. To see it any other way only proves that a person has never been a victim of this kind of hate and most likely contributes to it or benefits directly from it. I doubt Emmett Till could see a grey area that should allow room for discussion on racism as he was being tortured or when whites put the hoses to us simply for wanting to vote. Oh I wonder if those victims in Colorado were thinking about having a discussion with shooter on his transphobic view points as they were getting mowed down?

3

Valiantheart t1_iy94x9v wrote

"Google and Youtube looking to modify and create their own beneficial narratives"

- The actual Truth

−1

Brytheguy1978 t1_iy96ayk wrote

Did you mean they are fighting to keep people's opinions out of the town square? Misinformation is really nothing more than someone sharing their opinion on something. Should we ban bumper stickers or T-shirts with opinions like “a sports team is the best”? Think about the implications this could have on a democracy such as ours.

2

syahir77 t1_iy97la5 wrote

Investing to narrate the information

2

knokout64 t1_iy9gvy2 wrote

Isn't Reddit's biggest complaint about social media platforms that they don't do enough to combat misinformation? Either you control the information to a degree or you allow any and all "fake news" to avoid controlling what the truth is.

If you think YouTube is picking the wrong narrative then by all means they're open to criticism, but complaining about the idea that they'd try and restrict blatant lies because they might support the lies instead seems hypocritical.

6

yem_slave t1_iy9ih0u wrote

Reddit is very censored and dissenting views are often banned from subs, so you are not seeing a representative sample of people, but rather you're seeing an echo chamber of people who all believe in censoring things that they don't like.

3

[deleted] t1_iy9npad wrote

The entirety of the origin is speculation at this point.

Main takeaway from my comment should be that it was labeled misinformation and you would get banned for sharing it but now, it’s an established, probable theory as published by the Wall Street Journal

Same stuff happened with the Hunter Biden laptop.

https://www.wsj.com/articles/another-potential-covid-19-lab-leak-clue-china-11644615472

1

Zerksys t1_iy9rhvz wrote

So the lab leak theory is not confirmed as fact, there's just a bit more evidence for it that brings slightly more credibility to it as a hypothesis? That's a long way from being confirmed as fact.

Same thing with the Hunter laptop stuff. Random computer repair person gets dropped a laptop with an unknown chain of custody by someone claiming to be Hunter Biden at a time when it would have been the most damaging to Joe Biden's campaign for president. The laptop then falls under the custody of not the police but Rudy Giuliani of all people. Never mind that the forensic evidence for the laptop is muddled at best.

This is why information surrounding controversial subjects from the conservative side gets marked as misinformation. Just like your previous post, you take speculation and misconstrue it as fact. So yes, it is still misinformation to say that the lab leak theory is a proven fact and it is still misinformation to say that the Hunter Biden laptop proves wrongdoing by Biden.

I'm not saying that in the future more information won't surface about these subjects that turns them into facts, but at present it's all speculation and should be treated as such.

1

leoheck t1_iy9uos3 wrote

lol, they are the misinformation itself.

0

Zerksys t1_iy9y6uu wrote

I believe the big difference between you and me is that I don't see misinformation filtering as a bad thing, because the internet is not centrally controlled. The market for web apps on the internet is pure, unregulated capitalism, which is how it should stay.

No one is forcing you to use Google or YouTube. People use them because they are good tools and they, more often than not, deliver accurate information. In the event that Google falls under the influence of a hypothetical deep state, it would become apparent very quickly by using another search engine. You would get completely different results. Over time, the inaccuracy of the information would cause a sizable amount of people to swap over to other tools. That's what's great about capitalism. It gives you options. The internet is democracy.

Another hard pill for free speech absolutists to swallow is that most people are just incapable of "doing their own research" at a level that is required to understand complex topics. I direct your attention to this site.

https://www.wyliecomm.com/2021/08/whats-the-latest-u-s-literacy-rate/

About half of the US has a literacy level below the reading capabilities of an eighth grader. Only 12 percent of the country reads proficiently enough to identify unreliable sources. The theory that people should be able to do their own research is great, but this actually just goes to prove my point: doing your own research is impossible for the average person when there is so much misinformation out there.

The scariest thing for me is not information being centrally controlled; there are far too many tools that let us get the information we want, and information suppression would also require large social changes to the fabric of Western society. What scares me is that foreign governments can use misinformation to control a population and sow chaos among its citizens.

2

Championship-Stock t1_iy9ywz6 wrote

There is a documentary on how YouTube was colluding with the Kremlin to spread misinformation. Check it out, it’s on YouTube.

−1

turnip_burrito t1_iya09ls wrote

Lmao, you think people are able to do their own research and come to sensible conclusions. We're human, not purely rational beings with unlimited time to research. We like passing along entertaining stories and have a billion internal biases we don't check unless we have the time. Almost nobody has the time. Who then will be able to reliably "do their own research"?

You need education (from real experts with credentials) and time to do actual research and come to an informed opinion.

1

Brytheguy1978 t1_iya329e wrote

I don’t know if you misunderstood what I was saying. What kind of misinformation are they fighting because the article didn’t specify specific details about misinformation. So if you basically say hey, you like that sports team because they’re the best, that is clearly an opinion. If I go out and say New York City is the best city in the world, obviously, that’s my opinion. If I go out and say that Tesla is the best vehicle ever made, that can be considered an opinion as well. The point I was making was unless they can specify exactly what kind of misinformation they’re talking about, they could be banning people’s opinions, or their rights to free speech, causing a lot of frustration and that could be considered a threat to our democracy. The government doesn’t have any guidelines on what can and cannot be said either. But honestly, this is for everyone in the United States and people should be frightened by this move. Any threat to our democracy should be taken seriously. You have a right to protest and speak your mind, and that should never be taken away from you. We do not live in Communist China.

2

[deleted] t1_iyaaip6 wrote

But experts are free to publish any findings or opinion pieces they want in my scenario.

And it’s irrelevant if they choose to research. They have the freedom to do so or the freedom to choose ignorance.

In your scenario we would have people at the tippy top choose the experts that are allowed to publish their opinions which would obviously conform to the few individuals bias.

We saw this at Twitter when legitimate sources were getting banned, even major news organizations during 2020 if they didn’t conform to the narratives

1

[deleted] t1_iyaaw14 wrote

You can pretend all players in the tech industry are equal but that’s false.

Google and Apple could literally kill Twitter by simply delisting it from their App Store if Elon doesn’t run it or ban “misinformation” as they see fit.

1

ContinuousZ t1_iyalhou wrote

>combat misinformation

The best way to combat misinformation is through discussion and evidence. Censorship is a terrible way to combat misinformation and more often than not it gives misinformation more power.

1

turnip_burrito t1_iyaxz76 wrote

Misinformation in the article clearly refers to things like covid or political false information, presented as fact. An opinion like "Teslas are the best car" is very clearly not the meaning of the word misinformation. Don't be obtuse. NOBODY would classify that as misinformation.

0

chameleondoesitagain t1_iyayyln wrote

Define misinformation? It's not the first time Google has failed big time at this, and we don't need Big Tech to pick and choose their truths for us.

1

Bipolarbearingit t1_iyb6f2h wrote

You mean control information. Misinformation is combated with free speech and intellectual debate.

Not censorship. Which happens to be the favorite tool of these organizations.

Censorship only decreases trust.

1

Qlinkenstein t1_iybd37g wrote

All while Twitter is going in the opposite direction.

1

Autistosaur t1_iybg4wk wrote

Are they going to have this article deleted, then?

1

Zerksys t1_iybgcij wrote

They're not equal but making bad decisions will cause them to lose market share. Making a disastrous decision like intentionally giving misinformation to people is enough to lose you your core customer base.

1

turnip_burrito t1_iyc3fue wrote

Clearly if you are being banned or silenced for having an opinion, then those silencing or banning are in the wrong. That isn't stopping misinformation. It's something else (silencing opinions). That should not stand.

Stopping the spread of misinformation is a different issue. I'd like to know where you saw these medical officials said the vaccine doesn't reduce the chance of transmission?

1

OraxisOnaris1 t1_iyc7gjq wrote

Meanwhile, Elon has decided misinformation is welcome on Twitter. Unless you make fun of him. Then he gets super offended and flogs one of his sycophants to swing the ban hammer.

1

I_ONLY_PLAY_4C_LOAM t1_iyc8gih wrote

This is so naive lol. Discussion and evidence don't matter in the face of engagement algorithms. By the time you debunk something with a nuanced conversation, 10 more conspiracy theories with no evidence are already circulating.

0

NotTooDistantFuture t1_iycpwi9 wrote

They still can’t figure out how to ban spambots from replying to every comment on a video saying “you won a prize. Contact me on Telegram [pointing emoji] username”.

This has been a widespread issue on almost every reasonably large video to the point that most creators have had to post to Twitter and even create videos saying it’s not them and they won’t do it. It’s still happening and they haven’t even mixed up the strategy. It’s the same damn messages spammed to every comment as a reply.

There are a hundred ways to combat this with very simple rules: no phone numbers in usernames, no replying to comments on a channel if your username matches the channel's username at all, limiting the number of replies one account can post to one video, limiting the number of identical comments one account can post, actually responding to spam reports, giving creators tools to limit types of interactions, cross-correlating spam accounts to IP origins and browser user agents, flagging words like “contest” or “contact”.
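Rules like those could be sketched as a simple rule-based filter. Here's a minimal illustration in Python; the patterns, phrases, and thresholds are made-up assumptions for the sake of example, not anyone's actual production logic:

```python
import re

def looks_like_spam(comment: str, username: str, channel_name: str) -> bool:
    """Toy rule-based spam check illustrating the heuristics above.
    All patterns and thresholds are illustrative assumptions."""
    # Rule: long digit runs (phone-number-like) in the username
    if re.search(r"\d{7,}", username):
        return True
    # Rule: username mimics the channel's name (impersonation)
    canon = lambda s: s.lower().replace(" ", "")
    if canon(channel_name) in canon(username):
        return True
    # Rule: flag bait phrases like "contest" or "contact"
    bait = ("won a prize", "contest", "contact", "telegram")
    text = comment.lower()
    return any(phrase in text for phrase in bait)
```

A real system would obviously need exemptions (the channel owner's own replies, for one), per-account rate limits, and human review of reports, but even checks this crude would catch the copy-pasted "you won a prize" replies.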

They can’t even do this much and here you don’t even have to wrangle with the difficult question of “what is true?”.

1