Submitted by goki7 t3_11dj1ey in technology
CrucioIsMade4Muggles t1_ja8w47k wrote
Hashes...so someone changes a single pixel and this system doesn't work.
Top notch work Meta. /s
Gerzhus t1_ja8xc0x wrote
There are perceptual hash functions designed for images that aren’t susceptible to basic alterations like a single changed pixel. They aren’t the ones you would use for, say, hashing passwords.
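For a rough illustration (a toy Python sketch, not anything Meta actually runs, and the filename is made up), here's a minimal average hash (aHash) next to a cryptographic hash, showing why flipping one pixel defeats the latter but not the former:

```python
# Toy comparison: cryptographic hash vs. a perceptual "average hash".
# Illustration only -- not Meta's actual pipeline. Assumes Pillow is installed.
import hashlib
from PIL import Image

def ahash(img: Image.Image, size: int = 8) -> int:
    """Average hash: shrink, grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # 64-bit fingerprint for size=8

def hamming(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

img = Image.open("photo.jpg")        # made-up filename
tweaked = img.copy()
tweaked.putpixel((0, 0), (0, 0, 0))  # change a single pixel

# Cryptographic hashes: completely different digests.
print(hashlib.sha256(img.tobytes()).hexdigest())
print(hashlib.sha256(tweaked.tobytes()).hexdigest())

# Perceptual hashes: Hamming distance ~0, so it still matches.
print(hamming(ahash(img), ahash(tweaked)))
```

The cryptographic digests come out completely different, while the perceptual fingerprints end up identical or within a bit or two, so a threshold comparison still matches.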
Disastrous_Court4545 t1_ja8z3x1 wrote
Mind sharing a few of those hash functions?
Gerzhus t1_ja95edw wrote
Reposting due to automod.
“Known as PDQ and TMK+PDQF, these technologies are part of a suite of tools we use at Facebook to detect harmful content, and there are other algorithms and implementations available to industry such as pHash, Microsoft’s PhotoDNA, aHash, and dHash. Our photo-matching algorithm, PDQ, owes much inspiration to pHash although was built from the ground up as a distinct algorithm with independent software implementation.”
I don’t know if all of them are open source; some might be proprietary.
Source: meta/fb blog post about how they fight CSAM.
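For the curious, here's a hedged sketch of dHash (difference hash), one of the algorithms named above. It's the textbook version of the idea, not Meta's PDQ implementation:

```python
# Textbook dHash (difference hash) -- not Meta's PDQ, which is a
# separate, more robust design. Assumes Pillow is installed.
from PIL import Image

def dhash(img: Image.Image, size: int = 8) -> int:
    """Each bit records whether a pixel is brighter than its right-hand
    neighbor, so the hash tracks gradients rather than exact bytes."""
    # Resize to (size+1) x size so every row yields `size` comparisons.
    small = img.convert("L").resize((size + 1, size), Image.LANCZOS)
    bits = 0
    for y in range(size):
        for x in range(size):
            left = small.getpixel((x, y))
            right = small.getpixel((x + 1, y))
            bits = (bits << 1) | (1 if left > right else 0)
    return bits  # 64-bit hash for size=8
```

Two hashes are then compared by Hamming distance against a tuned threshold rather than tested for exact equality, which is what lets matches survive small edits like recompression or single-pixel tweaks.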
HanaBothWays t1_ja9wazq wrote
Unless I’m mistaken, these are the same hash functions social media platforms use to detect and take down copyrighted media, too.
CrucioIsMade4Muggles t1_ja92h9v wrote
That's good to know. I'm still fairly skeptical. This seems like the literal least-effort approach one could take and still claim to be doing something, to the point where I think they will have spent more money advertising the effort than they spent developing the system itself.
HanaBothWays t1_ja93hzi wrote
This is the same system that’s used to detect and take down Child Sexual Abuse Material (CSAM). It’s been around for years. Meta is just expanding the criteria for what images (or hashes of images) they will use it on.
The CSAM system was not previously used to detect and take down nude photos that teens shared consensually; now it is, even if the subject of the photo has since become a legal adult.
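To make the "hashes of images" part concrete, here's a purely illustrative sketch of how a hash-blocklist check might be wired up; the hash values, threshold, and names are all invented for the example, and real deployments (PhotoDNA, PDQ) use their own formats and tuned match thresholds:

```python
# Purely illustrative hash-blocklist check. Hash values, threshold,
# and helper names are invented for this example.
KNOWN_HASHES = {
    0x8F3A2C19D4B07E65,  # fingerprints of previously reported images
    0x1B2E4D8A90C3F576,
}

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def should_flag(upload_hash: int, threshold: int = 10) -> bool:
    """Flag an upload whose hash is close to any known-bad hash."""
    return any(hamming(upload_hash, known) <= threshold
               for known in KNOWN_HASHES)
```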
nooshaw t1_ja95c6l wrote
That's not how CSAM hashes work.
HanaBothWays t1_ja97j9u wrote
Most people did not read the article at all and don’t realize this is an expansion of the existing CSAM takedown tool that Facebook has had in place for many years. (Most other social media sites have very similar tools.)