
VincentNacon t1_itbo021 wrote

Shhhh... let him fail some more. We want his metaverse to succeed at dragging his company and him down with it.

64

domino2064 t1_itbshgt wrote

Eh. It probably won't, though. Not as long as grandma and the aunts and uncles keep submitting themselves to social data collection and various experiments.

Want to kill Meta and change how social media impacts us? Ban the automatic opt-in clauses that bury these experiments in user agreements. Require users to manually review a separate agreement and consent to it with a digital signature and initials in three or more places, and then require full disclosure for all experiments.

In other words, if your grandma and/or her data was used in an experiment, directly or indirectly, she must be sent a plain-English email describing the experiment: its purpose, the control and the variables, any placebos, and where she fell into all of it. A law like this would essentially break Meta's current model and damage their revenue stream.

21

tinytooraph t1_itbtwrk wrote

Are the experiments critical to their ad revenue stream? I understand your position against them, but would requiring informed consent meaningfully change how they make money?

3

domino2064 t1_itbuzb8 wrote

Yes. Ad revenue is definitely their cash cow, but social experimentation has been another major source of income, both in the data they can sell and the data they can use to sharpen their platform's impact. They've been surprisingly open about some of the experiments, since those shaped how their algorithms work, but even then there's been little in the way of regulation, let alone ethical oversight.

And it isn't just Facebook - Reddit and other companies have performed social experiments on their user bases as well. The difference is that, among the experiments revealed to the public, Facebook has admitted to adversely impacting the well-being of large groups of users.

In one experiment that Andrew Marantz cited in his book, Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation, Facebook intentionally showed one group of users only positive posts and news articles and another group only negative content. Those exposed to the negative content expressed emotional distress, and their well-being declined more sharply than the positive-content group's improved. It's thought to be one of the more influential experiments when it came to tuning the overall algorithm behind Facebook's News Feed.

5