
HanaBothWays t1_jactbkc wrote

This tool is an expansion of the existing tool used to detect and take down CSAM (Child Sexual Abuse Material). Dedicated adult content sites like OnlyFans and Pornhub also use that tool. They may adopt this expansion as well if it works out on the platforms that are early adopters, since content involving minors, or anything the subjects of the uploaded media did not consent to, is against their policies.

Expanding this to filter out adult content in general would be very difficult, because the tool only works on "known" media, that is, media for which a hash has already been uploaded to the database. These tools can't recognize "hey, that's a naked child/teenager" or "hey, that's a boob." They can only recognize "that media matches a signature in my database."
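To illustrate the point, here's a minimal sketch of that lookup logic. All names here are hypothetical, and real systems (e.g. PhotoDNA or PDQ) use perceptual hashes that survive resizing and re-encoding rather than the cryptographic hash used below for simplicity; the key idea is that matching only works against signatures already in the database:

```python
import hashlib

# Hypothetical database of signatures for already-known, flagged media.
KNOWN_HASHES: set[str] = set()

def media_hash(data: bytes) -> str:
    # Placeholder: a cryptographic hash stands in for a perceptual hash here.
    # It matches only exact byte-for-byte copies, which is why real systems
    # use robust perceptual hashing instead.
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    # The tool can only answer "does this match a signature in my database?"
    # It has no model of what the media actually depicts.
    return media_hash(data) in KNOWN_HASHES
```

Anything not already hashed into the database, no matter what it depicts, sails through unmatched.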
