Submitted by originmsd t3_10qxr78 in Futurology
GukkiSpace t1_j6t8w37 wrote
My bet is the same companies who make the deepfake tools will make the deepfake detection tools. Create a problem, maybe even give the problem away free to the common man, then sell the solution to the parties who need it. Pretty much straight outta Apple's playbook.
fruor t1_j6u59yn wrote
This is the correct answer. Training AI means pitting competing AIs against each other. You can't improve the skills to achieve a goal without also advancing the challenges it faces.
The companies holding the best algorithms to evade detection will always be the companies with the best detection algorithms.
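The dynamic described above can be sketched as a toy loop (hypothetical names and update rules, deliberately much simpler than a real GAN): an evasion model only improves when a detector is there to catch it, and each side's progress drives the other's.

```python
def adversarial_rounds(rounds=100):
    # Deterministic toy: the forger's "fake score" starts far from
    # the real score. Each round the detector places its threshold
    # at the midpoint between real and fake, and the forger moves
    # 10% closer to the real score whenever it gets detected.
    real, fake = 0.0, 10.0
    threshold = None
    for _ in range(rounds):
        threshold = (real + fake) / 2   # detector adapts
        if fake > threshold:            # forger got caught...
            fake -= 0.1 * (fake - real) # ...so it improves
    return fake, threshold

fake_score, threshold = adversarial_rounds()
# After 100 rounds the forger's output is nearly
# indistinguishable from the real score.
```

The point of the sketch is just the feedback loop: without the detector's threshold to test against, the forger has no signal telling it how to improve.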
oddinpress t1_j6vghfg wrote
Yeah, except at the end of the day a video is just pixels. There's a point where it's physically impossible to distinguish an AI-altered video from a genuine recording of a real event.
Even metadata can be bypassed if someone really wants to.
That solution can't be sold if detection isn't possible.
GukkiSpace t1_j6vi39d wrote
Well yeah, that's true for a still image or something, but once you have anything with more than a single sample the job becomes much easier; it's a game of sequential comparative data.
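One way to read "sequential comparative data" is that consecutive video frames should change smoothly, so a tampered frame stands out against its neighbours. A minimal sketch (hypothetical function and toy 1-D "frames", not any real detector):

```python
def flag_inconsistent_frame(frames):
    """Return the index of the frame whose mean absolute difference
    from its neighbours is largest -- a crude stand-in for checking
    temporal consistency across a sequence. (Toy example.)"""
    def diff(a, b):
        return sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    scores = []
    for i in range(1, len(frames) - 1):
        scores.append((diff(frames[i], frames[i - 1]) +
                       diff(frames[i], frames[i + 1]), i))
    return max(scores)[1]

# A smooth sequence of six 8-pixel "frames", with frame 3 altered:
frames = [[v + t for v in range(8)] for t in range(6)]
frames[3] = [v + 40 for v in frames[3]]
suspect = flag_inconsistent_frame(frames)  # returns 3
```

A single still image offers no such neighbours to compare against, which is the commenter's point: more samples make the detection job easier.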
imuniqueaf t1_j6w5e37 wrote
Bingo.
"Those ones who sell the panic, also sell the cure"
Nebula_Zero t1_j6wn24m wrote
They can probably hide it the same way you hide Photoshop edits. An edit is detected by comparing its pixels to the rest of the image, and a computer can very easily see the differences in resolution and other small things. But if someone makes a high-resolution image and then compresses it enough that there are slight compression artifacts on everything, you can't detect the edit anymore; the details that reveal the Photoshop job disappear into the compression artifacts.
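The masking effect described above can be sketched with quantization, a crude stand-in for lossy JPEG-style compression (hypothetical helper and made-up pixel values): a subtle edit visible in the raw pixels vanishes once both versions are snapped to the same coarse grid.

```python
def quantize(pixels, step):
    # Snap each value to the nearest multiple of `step`,
    # mimicking the quantization stage of lossy compression.
    return [step * round(p / step) for p in pixels]

original = [100, 101, 102, 103]
edited   = [100, 101, 105, 103]   # a subtle "Photoshop" tweak

# Before compression, a pixel-level comparison sees the edit:
visible = [a != b for a, b in zip(original, edited)]

# After coarse quantization, both versions collapse to the same
# values and the edit is no longer detectable this way:
hidden = quantize(original, 8) == quantize(edited, 8)
```

The design point is that quantization discards exactly the fine detail that forensic comparison relies on, which is why heavy recompression frustrates edit detection.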
Fafniiiir t1_j90f9ep wrote
I honestly don't think it'll matter in most cases.
Once it's out there it's too late.
It's like when a newspaper puts out fake news: almost no one reads the correction afterwards, and the fake news keeps being perpetuated.