Comments

CrucioIsMade4Muggles t1_ja8w47k wrote

Hashes...so someone changes a single pixel and this system doesn't work.

Top notch work Meta. /s

6

Gerzhus t1_ja8xc0x wrote

There are hash functions that work on images and that aren’t susceptible to basic alterations. They aren’t the ones you would use for, say, hashing passwords.

4

Disastrous_Court4545 t1_ja8z3x1 wrote

Mind sharing a few of those hash functions?

1

Gerzhus t1_ja95edw wrote

Reposting due to automod.

“Known as PDQ and TMK+PDQF, these technologies are part of a suite of tools we use at Facebook to detect harmful content, and there are other algorithms and implementations available to industry such as pHash, Microsoft’s PhotoDNA, aHash, and dHash. Our photo-matching algorithm, PDQ, owes much inspiration to pHash although was built from the ground up as a distinct algorithm with independent software implementation.”

I don’t know if all of them are open source; some might be proprietary.

Source: meta/fb blog post about how they fight CSAM.
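For a rough sense of how these perceptual hashes differ from the cryptographic hashes you’d use for passwords, here is a minimal dHash-style sketch in Python (using Pillow). It’s an illustration only, not Meta’s PDQ, and the function names are mine:

```python
# Minimal difference-hash (dHash) sketch; illustration only, not PDQ.
from PIL import Image

def dhash(path, hash_size=8):
    # Shrink to a tiny grayscale image so recompression, resizing, or a
    # single changed pixel barely moves the resulting fingerprint.
    img = Image.open(path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS)
    bits = 0
    for y in range(hash_size):
        for x in range(hash_size):
            # Each bit records whether brightness increases left-to-right.
            bits = (bits << 1) | (img.getpixel((x, y)) > img.getpixel((x + 1, y)))
    return bits  # 64-bit fingerprint

def hamming(a, b):
    # Near-duplicate images give fingerprints only a few bits apart.
    return bin(a ^ b).count("1")
```

Matching is then a Hamming-distance threshold rather than an exact comparison, which is why changing a single pixel doesn’t defeat it.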

1

HanaBothWays t1_ja9wazq wrote

Unless I’m mistaken these are the same hash functions social media platforms use to detect and take down copyrighted media, too.

1

CrucioIsMade4Muggles t1_ja92h9v wrote

That's good to know. I'm still fairly skeptical. This seems like the literal least effort approach one could take and still claim they are doing something--to the point where I think they will have spent more money advertising their effort than they spent on developing the system itself.

1

HanaBothWays t1_ja93hzi wrote

This is the same system that’s used to detect and take down Child Sexual Abuse Material (CSAM). It’s been around for years. Meta is just expanding the criteria for what images (or hashes of images) they will use it on.

The CSAM system was not previously used to detect and take down nude photos that teens shared consensually: now, it is, even if the subject of the photo has since become a legal adult.

2

nooshaw t1_ja95c6l wrote

That's not how CSAM hashes work.

3

HanaBothWays t1_ja97j9u wrote

Most people did not read the article at all and don’t realize this is an expansion of the existing CSAM takedown tool that Facebook has had in place for many years. (Most other social media sites have very similar tools.)

2

ampjk t1_ja911vh wrote

Is this going to be like the poor people who comb the internet for child porn to remove it, and who commit sudoku at a high rate compared to moderators at other large companies?

1

HanaBothWays t1_ja916ts wrote

I suspected that this would basically work like the tools used to recognize and spike Child Sexual Abuse Material (CSAM) images, and it actually does - it’s the same tools and the same database! This is basically expanding the eligibility criteria for what can go into the database.

Previously if you sent your high school sweetheart a nude selfie and that person did whatever with it, you didn’t have a lot of options, but now you can upload a hash of the picture (not the actual picture) to the database and it will get taken down.

Also, if you are a legal adult now but have nude photos of yourself from when you were a minor floating around, you can upload hashes to the database and have them taken down.
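Roughly, the submission side looks like this (a hypothetical sketch; the endpoint and field names below are made up for illustration, not the real service’s API). Only the fingerprint ever leaves your device:

```python
# Hypothetical sketch: only the fingerprint is uploaded, never the image.
import requests

def submit_fingerprint(fingerprint: int) -> None:
    # 'fingerprint' would come from a perceptual hash computed locally,
    # e.g. the dhash sketch earlier in the thread; the photo itself stays on-device.
    payload = {"hash": format(fingerprint, "016x")}  # hex string, no pixels
    # example.org is a placeholder endpoint, not the real service.
    requests.post("https://example.org/hash-report", json=payload, timeout=10)
```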

1

[deleted] t1_ja91cmq wrote

[deleted]

1

HanaBothWays t1_ja94k72 wrote

We are talking about situations where a minor consented to share an intimate photo with another party having the understanding that the other party would not spread it around in public…and the other party did so anyway.

When this kind of thing happens between adults it’s called “revenge porn” and the person who spread the photo is often subject to civil or criminal liability for doing so.

If you are seriously arguing that someone deserves to have nude photos of themselves as a minor floating around to “teach them a lesson” when having it happen to them as an adult would make them victims of a crime, you probably need to log off for a while.

4

curiousdressing t1_ja91x15 wrote

If it goes on the internet, it stays on the internet.

1

iamComfortablyDone t1_ja93hjg wrote

Not that simple. If one's phone is stolen or hacked, photos/videos that weren't on the internet suddenly find themselves on the internet. Without knowledge or consent.

1

HanaBothWays t1_ja95sqm wrote

You mean like actual child porn? This is basically just expanding on the system they use to detect and remove child porn.

1

Arrowtica t1_ja99ou7 wrote

You know what helps prevent that spread more? Not existing.

1

Kira9059 t1_ja8zd3x wrote

Nice! This has been needed for a while now.

0

iamComfortablyDone t1_ja93104 wrote

Is this Zuckerberg's way of skirting huge SEC fines? Isn't this the same Instagram that peddles videos of animal cruelty, sexual assault, torture and killing?

0

t0slink t1_jaa7hsp wrote

> peddles videos of animal cruelty, sexual assault, torture and killing?

You realize Reddit has all of that content openly, yeah?

This isn't "skirting" anything, it's a legitimately useful tool for teens.

2

iamComfortablyDone t1_jad17bv wrote

So tribal, your response. Is this a sponsored statement, from Meta? Is this Mark?

1

apextek t1_ja9789e wrote

This is so disconnected. Teens don’t use FB or Insta, they’re on TikTok.

0

HanaBothWays t1_ja9t4ll wrote

Lots of young people use Instagram.

And if you read the article (what a concept LOL), this can be used for photos taken and spread on Facebook a long time ago. If your cad of a high school boyfriend posted the pictures you gave him 15-20 years ago on Facebook, you can send a hash to this thing to have them removed.

5

CenlTheFennel t1_jab83m6 wrote

So they should do nothing? This is an objectively awful take.

1

bokbie t1_ja9a3hf wrote

Why can’t the phones just not save an intimate photo of a teen?

−1