TheFriendlyArtificer t1_je2qiul wrote

How?

The neural network architectures are out in the wild. The weights are trivial to find. Generating your own just requires a ton of training data and some people to annotate it, and even the annotators are only needed if the model is supervised.

I have a stripped down version of Stable Diffusion running on my home lab. It takes about 25 seconds to generate a single 512x512 image, but this is on commodity hardware with two GPUs from 2016.
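(For a sense of how little code that takes: here is a minimal sketch using Hugging Face's diffusers library. The checkpoint ID and prompt are illustrative; any public Stable Diffusion checkpoint works.)

```python
# Minimal text-to-image sketch with Hugging Face's diffusers library.
# Assumes a CUDA GPU; the checkpoint ID and prompt are placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision to fit commodity VRAM
).to("cuda")

# One 512x512 image; on older cards this takes tens of seconds.
image = pipe(
    "a lighthouse in a thunderstorm, oil painting",
    height=512,
    width=512,
).images[0]
image.save("out.png")
```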

If I, a conspicuously handsome DevOps nerd, can do this in a weekend and can deploy it using a single Docker command, what on earth can we do to stop scammers and pissant countries (looking at you, Russia)?

There is no regulating our way out of this. Purpose-built AI processors will bring the cost barrier down even further. (Though it is pretty cool to be able to run NN inference on a processor architecture that was becoming mature when disco was still cool.)

Edit: For the curious, the repo with the pre-built Docker files (not mine) is https://github.com/NickLucche/stable-diffusion-nvidia-docker
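A minimal invocation, sketched from that repo's README (the image name is the repo author's; flags and tags may have changed, so check the README):

```bash
# Run the prebuilt image from the repo above: hands all NVIDIA GPUs to the
# container (needs the NVIDIA Container Toolkit on the host) and exposes
# the web UI on port 7860. Image name/tag per the repo README.
docker run --name stable-diffusion --gpus all -it -p 7860:7860 \
  nicklucche/stable-diffusion
```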

46

DocHoss t1_je3f5k0 wrote

You really are very handsome! And really smart too.

You want to share that Docker command for a poor, incompetent AI dabbler?

Did I mention you are very handsome and smart?

15

lucidrage t1_je3zmti wrote

What's your Dockerfile setup, you incredibly handsome DevOps engineer? I could never get the Docker container to recognize my GPU on Windows...

3

NamerNotLiteral t1_je533mj wrote

I only see one way to regulate models whose weights are public already.

Licenses hard-built into the GPU itself, through driver code or firmware. Nvidia and AMD could definitely do this: when you load a model into the GPU, the driver could check the exact weights, and if they match a 'banned' model it could shut the load down.

Most of these models are too large for individuals to train from scratch, so you'd only need to ban the weights already floating around. Fine-tuning wouldn't get around it either, since you have to load the original model before you can fine-tune it.

Speaking as a lifelong pirate: yes, there would be ways to circumvent this. But it's something Nvidia could do, and it would immediately raise the barrier to entry by a lot.
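To make that concrete, here is a toy sketch of what a driver-side check might look like. Everything in it is hypothetical: real GPU drivers expose no such hook, and the `BANNED_SHA256` list, `fingerprint` helper, and `upload_to_gpu` function are invented purely for illustration.

```python
# Toy illustration of a driver-side "banned weights" check. Hypothetical:
# no real driver exposes this hook, and the hash below is a placeholder.
import hashlib

BANNED_SHA256 = {
    # sha256 of some hypothetical banned checkpoint's raw weight bytes
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(weight_bytes: bytes) -> str:
    """Hash the raw tensor bytes as they are uploaded to the GPU."""
    return hashlib.sha256(weight_bytes).hexdigest()

def upload_to_gpu(weight_bytes: bytes) -> None:
    """Refuse banned weights; otherwise pass the buffer to the real driver."""
    if fingerprint(weight_bytes) in BANNED_SHA256:
        raise PermissionError("model weights are on the banned list")
    # ... hand weight_bytes to the actual driver upload path here ...
```

The sketch also shows the weakness the pirate's instinct flags: flip a single weight bit and the hash changes, so real enforcement would need fuzzy fingerprinting of weights, a much harder problem.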

2

Trip-trader t1_je3ho92 wrote

Making deepfakes is one thing; sharing them with the internet and millions of people is another. Damn straight you can regulate the crap out of anything. Go ask the EU.

0

Call-Me-Robby t1_je3y5w0 wrote

As the war on drugs showed us, there’s a very wide gap between laws and their enforcement.

11

FuckOff555555 t1_je3vsfr wrote

The easiest way would be to force Nvidia, AMD, Intel, and Apple to not allow AI training on consumer hardware.

−9

SwagginsYolo420 t1_je46oaa wrote

The hardware is already out there though.

Also it would be a terrible idea to have an entire new emerging technology only in the hands of the wealthy. That's just asking for trouble.

It would be like saying regular hardware shouldn't be allowed to run Photoshop or a spreadsheet or a word processor because somebody might do something bad with it.

People are going to have to learn that images and audio and video can be faked, just like they have to learn that an email from a Nigerian prince is a fake.

There's no wishing this stuff away; the cat is already out of the bag.

10

Glittering_Power6257 t1_je49f6f wrote

As Nvidia fairly recently learned with its crypto-mining hash-rate limiter, blockades against running certain algorithms will be circumvented. Many applications also use GPGPU to accelerate non-graphics workloads (GPUs are essentially highly parallel supercomputers on a chip), so cutting off GPGPU is not on the table either. Unless you wish to completely screw over the open-source community and go the whitelist route.
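For a sense of why that's unworkable: the "AI training" path and the ordinary GPGPU path are the same path. A trivial illustration (PyTorch here, purely for illustration; nothing about it is AI-specific):

```python
# Ordinary, non-graphics GPGPU work: a large matrix multiply on the GPU.
# Nothing here is "AI training", yet it exercises exactly the same compute
# path a training framework would, which is why a blanket block is so blunt.
import torch

a = torch.randn(4096, 4096, device="cuda")
b = torch.randn(4096, 4096, device="cuda")
c = a @ b  # runs on the GPU's compute units
print(c.sum().item())
```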

4