Submitted by goki7 t3_11dj1ey in technology
Comments
Gerzhus t1_ja8xc0x wrote
There are hash functions designed for images that aren’t defeated by basic alterations. They aren’t the ones you would use for, say, hashing passwords.
Disastrous_Court4545 t1_ja8z3x1 wrote
Mind sharing a few of those hash functions?
[deleted] t1_ja914ol wrote
[removed]
AutoModerator t1_ja914rk wrote
Unfortunately, this post has been removed. Facebook links are not allowed by /r/technology.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
Gerzhus t1_ja95edw wrote
Reposting due to automod.
“Known as PDQ and TMK+PDQF, these technologies are part of a suite of tools we use at Facebook to detect harmful content, and there are other algorithms and implementations available to industry such as pHash, Microsoft’s PhotoDNA, aHash, and dHash. Our photo-matching algorithm, PDQ, owes much inspiration to pHash although was built from the ground up as a distinct algorithm with independent software implementation.”
I don’t know if all of them are open source; some might be proprietary.
Source: meta/fb blog post about how they fight CSAM.
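For a rough sense of how these image hashes differ from password hashes, here's a minimal sketch of a difference hash (dHash) in Python. This is illustrative only, not Meta's PDQ implementation:

```python
# Rough sketch of a "difference hash" (dHash), one of the perceptual hashes
# mentioned above (illustrative only, not Meta's PDQ implementation).
# Requires Pillow: pip install Pillow
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    # Shrink to a (hash_size+1) x hash_size grayscale thumbnail; this discards
    # the fine detail that a one-pixel edit or recompression would change.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())

    # Each bit records whether a pixel is brighter than its right-hand neighbor.
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(1 if left > right else 0)

    # Pack the 64 bits into a single integer fingerprint.
    return sum(bit << i for i, bit in enumerate(bits))
```

Because the hash is computed from coarse brightness gradients rather than raw bytes, small edits move the fingerprint only a few bits instead of changing it completely the way a cryptographic hash would.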
HanaBothWays t1_ja9wazq wrote
Unless I’m mistaken these are the same hash functions social media platforms use to detect and take down copyrighted media, too.
CrucioIsMade4Muggles t1_ja92h9v wrote
That's good to know. I'm still fairly skeptical. This seems like the literal least effort approach one could take and still claim they are doing something--to the point where I think they will have spent more money advertising their effort than they spent on developing the system itself.
HanaBothWays t1_ja93hzi wrote
This is the same system that’s used to detect and take down Child Sexual Abuse Material (CSAM). It’s been around for years. Meta is just expanding the criteria for what images (or hashes of images) they will use it on.
The CSAM system was not previously used to detect and take down nude photos that teens shared consensually; now it is, even if the subject of the photo has since become a legal adult.
nooshaw t1_ja95c6l wrote
That's not how CSAM hashes work.
HanaBothWays t1_ja97j9u wrote
Most people did not read the article at all and don’t realize this is an expansion of the existing CSAM takedown tool that Facebook has had in place for many years. (Most other social media sites have very similar tools.)
ampjk t1_ja911vh wrote
Is this going to be like the poor people who comb the internet for child porn to remove it, and who commit sudoku at a high rate compared to other moderators at large companies?
HanaBothWays t1_ja916ts wrote
I suspected that this would basically work like the tools used to recognize and spike Child Sexual Abuse Material (CSAM) images and it actually is - it’s the same tools and the same database! This is basically expanding the eligibility criteria for what can go into the database.
Previously if you sent your high school sweetheart a nude selfie and that person did whatever with it, you didn’t have a lot of options, but now you can upload a hash of the picture (not the actual picture) to the database and it will get taken down.
Also, if you are a legal adult now but have nude photos of yourself from when you were a minor floating around, you can upload hashes to the database and have them taken down.
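Just to illustrate the mechanics (a hypothetical sketch, not Meta's actual matching code): on the platform side, the submitted fingerprint can be compared against fingerprints of uploaded images by Hamming distance, so recompressed or lightly edited copies still match. The threshold below is invented for the example:

```python
def hamming_distance(h1: int, h2: int) -> int:
    # Count the bits that differ between two 64-bit image fingerprints.
    return bin(h1 ^ h2).count("1")

def is_near_duplicate(submitted: int, candidate: int, threshold: int = 10) -> bool:
    # A small distance means visually similar images, even after re-encoding
    # or minor edits; requiring distance 0 would only catch exact copies.
    # The threshold of 10 is made up for illustration, not a real tuning value.
    return hamming_distance(submitted, candidate) <= threshold
```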
[deleted] t1_ja91cmq wrote
[deleted]
HanaBothWays t1_ja94k72 wrote
We are talking about situations where a minor consented to share an intimate photo with another party having the understanding that the other party would not spread it around in public…and the other party did so anyway.
When this kind of thing happens between adults it’s called “revenge porn” and the person who spread the photo is often subject to civil or criminal liability for doing so.
If you are seriously arguing that someone deserves to have nude photos of themselves as a minor floating around to “teach them a lesson” when having it happen to them as an adult would make them victims of a crime, you probably need to log off for a while.
curiousdressing t1_ja91x15 wrote
if it goes on the internet, it stays on the internet
iamComfortablyDone t1_ja93hjg wrote
Not that simple. If one's phone is stolen or hacked, photos/videos that weren't on the internet suddenly find themselves on the internet, without knowledge or consent.
HanaBothWays t1_ja95sqm wrote
You mean like actual child porn? This is basically just expanding on the system they use to detect and remove child porn.
Arrowtica t1_ja99ou7 wrote
You know what helps prevent that spread more? Not existing.
noxii3101 t1_jaa559h wrote
not a hot dog
Quantum-traveler88 t1_jabw6rn wrote
Lol, took them 10+ years
Kira9059 t1_ja8zd3x wrote
Nice! This has been needed for a while now.
iamComfortablyDone t1_ja93104 wrote
Is this Zuckerberg's way of skirting huge SEC fines? Isn't this the same Instagram that peddles videos of animal cruelty, sexual assault, torture and killing?
t0slink t1_jaa7hsp wrote
> peddles videos of animal cruelty, sexual assault, torture and killing?
You realize Reddit has all of that content openly, yeah?
This isn't "skirting" anything, it's a legitimately useful tool for teens.
iamComfortablyDone t1_jad17bv wrote
So tribal, your response. Is this a sponsored statement, from Meta? Is this Mark?
apextek t1_ja9789e wrote
This is so disconnected. Teens don't use fb or insta, they're on tiktok.
HanaBothWays t1_ja9t4ll wrote
Lots of young people use Instagram.
And if you read the article (what a concept LOL), this can be used for photos taken and spread on Facebook a long time ago. If your cad of a high school boyfriend posted the pictures you gave him 15-20 years ago on Facebook, you can send a hash to this thing to have them removed.
CenlTheFennel t1_jab83m6 wrote
So they should do nothing? This is an objectively awful take.
bokbie t1_ja9a3hf wrote
Why can’t the phones just not save an intimate photo of a teen?
CrucioIsMade4Muggles t1_ja8w47k wrote
Hashes...so someone changes a single pixel and this system doesn't work.
Top notch work, Meta. /s