FuckOff555555 t1_je3vsfr wrote
Reply to comment by TheFriendlyArtificer in The guy behind the viral fake photo of the Pope in a puffy coat says using AI to make images of celebrities 'might be the line' — and calls for greater regulation by Lakerlion
The easiest way would be to force Nvidia, AMD, Intel, and Apple to block AI training on consumer hardware
SwagginsYolo420 t1_je46oaa wrote
The hardware is already out there though.
Also it would be a terrible idea to have an entire new emerging technology only in the hands of the wealthy. That's just asking for trouble.
It would be like saying regular hardware shouldn't be allowed to run Photoshop or a spreadsheet or a word processor because somebody might do something bad with it.
People are going to have to learn that images and audio and video can be faked, just like they had to learn that an email from a Nigerian prince is a fake.
There's no wishing this stuff away, the cat is already out of the bag.
Glittering_Power6257 t1_je49f6f wrote
As Nvidia learned fairly recently with its crypto-mining hash-rate limiter, hardware blocks on running certain algorithms will be circumvented. Many applications also use GPGPU to accelerate non-graphics workloads (GPUs are essentially highly parallel supercomputers on a chip), so cutting off GPGPU isn't on the table either, unless you want to completely screw over the open-source community and go the whitelist route.