Comments
WhatTheZuck420 t1_je3ai7m wrote
make sure she's wearing combat boots tho
Low-Restaurant3504 t1_je3cqlc wrote
Peep toe combat boots. I have refined tastes.
OkRutabaga702 t1_je4odmf wrote
You could have photoshopped the pope with the same jacket and gotten the same response
[deleted] t1_je5ehfe wrote
[removed]
DweEbLez0 t1_je3uc6a wrote
And a puffy jacket
Western-Image7125 t1_je2pksv wrote
Joke's on you, because he's already been circulating those for a while now.
Low-Restaurant3504 t1_je2pwo7 wrote
Pics, or it didn't happen.
[deleted] t1_je5eigx wrote
[removed]
Known2779 t1_je44q7t wrote
It's still not too late to jump on that influencer bandwagon. Or risk seeing yourself in a puffy coat on the internet.
DefiantDragon t1_je74nbx wrote
Low-Restaurant3504
>Oh, the line is celebrities. People with money and influence. Gotcha. Not you or I. Our "betters". Just gonna go commission some deepfake porn of this guy's mother and see how he feels then.
I'm going to make a deepfake of Jada Smith starring in GI Jane so that Will Smith will come to my house and slap me.
Low-Restaurant3504 t1_je7541a wrote
I want you in charge of everything.
DefiantDragon t1_je75rbz wrote
Low-Restaurant3504
>I want you in charge of everything.
Of course you do.
biggaywizard t1_je6mxt0 wrote
Make some porn of him and the pope and I'll buy it.
[deleted] t1_je5egi6 wrote
[removed]
packetofforce t1_je5tg92 wrote
Even if he actually meant it (which I doubt; his brain probably just said "celebrities" automatically given the context), it's way easier to make deepfakes of celebrities than of average people, because there's far more available data (photos, video, audio) on celebrities. It makes sense for the line to be celebrities: deepfakes of average people are more technically difficult (data availability), so hyper-real deepfakes of average people are chronologically further down the line, and by regulating at celebrities you also prevent deepfakes of average people. And wtf is your comment? The way you split hairs about his wording in such an aggressive manner was weird. Try visiting a therapist.
Low-Restaurant3504 t1_je675u1 wrote
Please be quiet while the adults are talking. Thanks.
packetofforce t1_je89awc wrote
Your behavior is quite disappointing for someone who considers themselves an adult. By the way, https://bestonlinetherapyservices.com https://www.betterhelp.com/get-started/
Low-Restaurant3504 t1_je8a1dh wrote
Ooooh. Weaponizing mental health to win an online argument. Really not a whole lot lower you can go as a person. Hell, I find it distasteful, and if it's making me feel a bit icky, I can imagine how strongly that's gonna make others feel.
Be better. For real, man.
packetofforce t1_je8b9k2 wrote
Nope, I genuinely think that you need a therapist. It's not a joke. The way you got aggressive about wording was just weird. Particularly since it has come to light that you apparently are an adult.
Low-Restaurant3504 t1_je8bmp3 wrote
You do what you gotta do then, bud. You're a little too weird for me, however, so I'm just gonna block you and move on with my life.
Good luck with... whatever this is that you are going through.
Fastriverglide t1_je1xgtp wrote
Is there deepfake porn of EVERY celebrity yet?
Trout_Shark t1_je220fa wrote
Pretty much. At least all the current hot ones.
MiserableLychee t1_je3xhhc wrote
I want Alan Alda deepfakes
Fastriverglide t1_je46mhq wrote
Ok but hear me out - his face on Princess Leia's body xD
[deleted] t1_je4s6nr wrote
[removed]
Trout_Shark t1_je4f9vp wrote
Mom? I thought you said you would stay off reddit...
[deleted] t1_je5ek6w wrote
[removed]
Fastriverglide t1_je3so3m wrote
Hmm is the Pope hot to someone? Is there porn of Mohammed?
LiberalFartsMajor t1_je3u0o6 wrote
You just put that in the universe
Fastriverglide t1_je46f1d wrote
My legacy! <3
LiberalFartsMajor t1_je46kk6 wrote
I'm picturing the Pope and Muhammad giving each other handjobs under their robe / cloak
Fastriverglide t1_je46w6t wrote
That COULD be part of the interfaith dialogue. I mean who KNOWS WHAT WILL work in the end.
I'd much prefer to have them united under such circumstances rather than in the mouthfrothing hatred of atheists 🤔
G33ONER t1_je4hc4r wrote
The real tug of war
animatedrouge2 t1_je4pc5n wrote
What a bad day to be literate
Fastriverglide t1_je5ibeb wrote
Nice haha. Would you prefer it to be whispered into your ear?
[deleted] t1_je5emib wrote
[removed]
[deleted] t1_je5elqr wrote
[removed]
FuckOff555555 t1_je3vjrv wrote
that's how you get put on a hitlist
Fastriverglide t1_je46hxl wrote
Oh no I think I'm far too old for that :P
[deleted] t1_je5ekz0 wrote
[removed]
Asha108 t1_je3um0j wrote
of course lmao
[deleted] t1_je5enb0 wrote
[removed]
spike4379 t1_je449ng wrote
clint eastwood yet?
Glader t1_je4k2kz wrote
Gilbert Gottfried? Now that he's passed on and become an ex-comedian, he'll never be able to make anything real.
JamonRuffles17 t1_je4omdm wrote
........ link?? 👀 is there a sub for this with a full collection?
[deleted] t1_je5eny8 wrote
[removed]
aflarge t1_je2sxx4 wrote
So are they gonna ban using photoshop to doctor pictures of the unconsenting? They're being sensationalist idiots.
EmbarrassedHelp t1_je2tjg0 wrote
You joke, but I could see governments trying to pressure Adobe into adding AI to Photoshop that constantly scans what you are making in order to try and block things they don't like.
ozonejl t1_je3ay6c wrote
I'm in the Adobe Firefly beta and the content filters are pretty restrictive. I deleted what I thought were a couple of innocuous words from my prompts, and it wouldn't let me use "Michael Jackson." To be fair, I was trying to make Michael Jackson at the karaoke bar with G.G. Allin, whom Adobe apparently doesn't know about.
[deleted] t1_je5eol1 wrote
[removed]
MetricVeil t1_je2w6cs wrote
Yep, they did something similar with photocopiers and paper currency, I believe.
[deleted] t1_je5epcc wrote
[removed]
WhatTheZuck420 t1_je3bhwp wrote
adobe: no problemo. we already scan in order to sell shit.
aflarge t1_je2v8pl wrote
Seems like a sure fire way to make sure Photoshop ceases to be an industry standard.
H3g3m0n t1_je4c5i7 wrote
There are copyrighted colors that Photoshop refuses to display without a $15-per-month subscription. Thanks, Pantone.
Photoshop also refuses to work on images of American currency.
aflarge t1_jed549p wrote
That's idiotic. That's like taking people to court because their picture of the night sky included the star you "own".
[deleted] t1_je3a23t wrote
[removed]
BobRobot77 t1_je3puh4 wrote
Well, the line should be drawn somewhere. I think sexual content of a non-consenting non-public figure is the line.
EnsignElessar t1_je55k6k wrote
I'm not sure they could even ban it at this point... it's too late. But something needs to be done. Otherwise our internet will be mostly bots, same deal with phone calls (probably already the case), but the scamming is about to get a whole lot more effective and scalable.
TheFriendlyArtificer t1_je2qiul wrote
How?
The neural network architectures are out in the wild. The weights are trivial to find. Generating your own just requires a ton of training data and some people to annotate. And that's assuming an unsupervised model.
I have a stripped down version of Stable Diffusion running on my home lab. It takes about 25 seconds to generate a single 512x512 image, but this is on commodity hardware with two GPUs from 2016.
If I, a conspicuously handsome DevOps nerd, can do this in a weekend and can deploy it using a single Docker command, what on earth can we do to stop scammers and pissant countries (looking at you, Russia)?
There is no regulating our way out of this. Purpose built AI processors will bring down the cost barrier even more substantially. (Though it is pretty cool to be able to run NN inferences on a processor architecture that was becoming mature when disco was still cool)
Edit: For the curious, the repo with the pre-built Docker files (not mine) is https://github.com/NickLucche/stable-diffusion-nvidia-docker
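For reference, the repo's quickstart really is essentially one command. This is from memory, so treat the image name, port, and flags as approximate and check the README for the current invocation:

```shell
# Run the prebuilt Stable Diffusion container with GPU access,
# exposing the web UI on a local port. Approximate invocation;
# verify against the repo's README before relying on it.
docker run --name stable-diffusion --pull=always \
  --gpus all \
  -p 7860:7860 \
  nicklucche/stable-diffusion
# Then browse to http://localhost:7860
```

The repo also documents multi-GPU options, which would be relevant to a two-card setup like the one described above.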
DocHoss t1_je3f5k0 wrote
You really are very handsome! And really smart too.
You want to share that Docker command for a poor, incompetent AI dabbler?
Did I mention you are very handsome and smart?
BidetAllDay t1_je3jm3n wrote
Dockers…Nice Pants!
TheFriendlyArtificer t1_je5iqme wrote
Edited my original content.
Not my repo, but it works like a charm in Debian 11 with two nVidia 2080s.
https://github.com/NickLucche/stable-diffusion-nvidia-docker
lucidrage t1_je3zmti wrote
What's your dockerfile setup, you incredibly handsome devops engineer? I could never get the docker container to recognize my gpu on windows...
[deleted] t1_je5er9b wrote
[removed]
TheFriendlyArtificer t1_je5iu1d wrote
Enjoy! It's not my repo, but the author has done a good job with documentation.
https://github.com/NickLucche/stable-diffusion-nvidia-docker
NamerNotLiteral t1_je533mj wrote
I only see one way to regulate models whose weights are public already.
Licenses hard-built into the GPU itself, through driver code or whatever. Nvidia and AMD can definitely do this. When you load the model into the GPU, they could check the exact weights, and if it's a 'banned' model they could shut it down.
Most of these models are too large for individuals to train from scratch, so you'd only need to ban the weights floating around. Fine tuning isn't possible either, since you need to load the original model first before you fine-tune it.
Yes, there would be ways to circumvent this, speaking as a lifelong pirate. But it's something that could be done by Nvidia, and would immediately massively increase the barrier to entry.
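A toy sketch of what that driver-side check could look like, written in Python rather than actual driver code. The denylist, hashes, and function names here are all invented for illustration; the point is just that a loader could fingerprint a serialized weights blob and refuse known-banned models:

```python
import hashlib

# Hypothetical denylist of SHA-256 fingerprints of banned weight files.
# (This entry is the hash of the bytes b"test", used for the demo below.)
BANNED_WEIGHT_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(weight_bytes: bytes) -> str:
    """Compute a stable fingerprint of a serialized weights blob."""
    return hashlib.sha256(weight_bytes).hexdigest()

def driver_would_block(weight_bytes: bytes) -> bool:
    """Sketch of the check a driver could run before loading a model."""
    return fingerprint(weight_bytes) in BANNED_WEIGHT_HASHES

print(driver_would_block(b"test"))         # True: fingerprint is on the list
print(driver_would_block(b"other model"))  # False: not on the list
```

Exact-match fingerprinting like this is also trivially defeated, though: perturb a single weight and the hash changes completely, which is exactly the kind of circumvention conceded above.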
[deleted] t1_je5es16 wrote
[removed]
[deleted] t1_je379g9 wrote
[removed]
[deleted] t1_je5eqmf wrote
[removed]
Trip-trader t1_je3ho92 wrote
Making deepfakes is one thing, sharing them with the internet and millions of people is another. Damn straight you can regulate the crap out of anything. Go ask the EU.
Call-Me-Robby t1_je3y5w0 wrote
As the war on drugs showed us, there’s a very wide gap between laws and their enforcement.
FuckOff555555 t1_je3vsfr wrote
the easiest way would be to force nvidia, amd, intel, and apple to not allow AI training on consumer hardware
SwagginsYolo420 t1_je46oaa wrote
The hardware is already out there though.
Also it would be a terrible idea to have an entire new emerging technology only in the hands of the wealthy. That's just asking for trouble.
It would be like saying regular hardware shouldn't be allowed to run photoshop or a spreadsheet or word processor because somebody might do something bad with it.
People are going to have to learn that images and audio and video can be faked, just like they had to learn that an email from a Nigerian prince is a fake.
There's no wishing this stuff away, the cat is already out of the bag.
Glittering_Power6257 t1_je49f6f wrote
As Nvidia has fairly recently learned, blockades on running certain algorithms will be circumvented. Many applications also use GPGPU to accelerate non-graphics workloads (GPUs are pretty much highly parallel supercomputers on a chip), so cutting off GPGPU is not on the table either. Unless you wish to completely screw over the open source community and go the whitelist route.
Special_Function t1_je430im wrote
You could have photoshopped the pope with the same jacket and gotten the same response.
EnsignElessar t1_je55ao5 wrote
True, but this is way easier. The ability to do this has been around for a long time, but it was a hard skill to learn. Now, just from your phone, you can type "Pope with a weird coat" and it will be created for you. Some other things to consider: it's text-to-image today, sure, but tomorrow it will be text-to-video, and then you combine that with text-to-audio. So a single person, not a studio, can easily make a fake of anyone saying anything you like.
workworkworkworky t1_je5ox9l wrote
Well, if it gets that easy, these things will be everywhere and everyone will just get used to them.
Several-Duck6956 t1_je9l7y7 wrote
You think you can get used to seeing a video of your mom’s gaping b hole being filled up by a rando? Like, ever? Or perhaps, your face being glued onto some of the sickest porn shit ever created? Is that something humans get used to? How long does it take?
[deleted] t1_je5phpy wrote
[removed]
[deleted] t1_je5ejdc wrote
[removed]
[deleted] t1_je4l7ai wrote
[deleted]
Myrkull t1_je4y8zy wrote
Why is that 'the issue'?
MetricVeil t1_je2x69a wrote
Rule 34. Plus, fakes of celebrities have been around for almost as long as the internet has. The only difference is that the quality of fakes has improved, exponentially.
To be honest, porn is a major factor on whether some technologies are taken up by the masses. :D
EnsignElessar t1_je566x6 wrote
No, that's not the only thing that has changed. It's also very scalable. Even a single individual could launch a massive disinformation campaign from their smartphone.
cinemachick t1_je31zzi wrote
On the one hand, we are definitely on the edge of a world where anything can be faked. On the other hand, we've been down this road before: Photoshop, "realistic" CGI, dodging and burning pinup prints, the fairy photograph hoaxes of the early 1900s, etc. We learn and adapt to changes incrementally, not everyone and not all at once, but we get there eventually. And let's be honest, misinformation has been present in the media for years - the sinking of the Lusitania was completely fabricated to create justification for war, way before anyone had AI or Photoshop. It all comes down to who the source is and their credibility, and it has since the dawn of the written word.
(But tbf, I'm in an industry that will be hit hard by AI so I understand the panic!)
ozonejl t1_je3bh7o wrote
Good to see a reasonable person who doesn’t just see a threat to their job and freaks out. New technology always comes with the same concerns and challenges. I’m kinda like…people already fall for loads of obviously, transparently fake shit on Facebook, but somehow this is gonna be so much worse?
RayTheGrey t1_je4adys wrote
It's the ease and speed of it that might be the difference.
EnsignElessar t1_je55ta5 wrote
Yes it will be worse. Because of scale. Instead of having to have an expert sit there making fakes and trying to spread them. You can automate most of that.
bobnoski t1_je55nlf wrote
The ease, speed, and accuracy of it. It's now possible, within minutes of a live video being broadcast, to use deepfake and AI voice generation to modify a video of a world leader. It doesn't have to be something where the entire video is faked or edited; say you edit a world leader saying "we will support Ukraine" into "we will no longer support Ukraine". Set it on blast, or in more repressive regimes run it as if it's the live feed, and disproving it becomes a far more difficult task than rebutting an article that says "this world leader said this thing".
The more realistic, multi-faceted, and abundant fakes are, the higher the chance that people no longer trust the real thing.
[deleted] t1_je5ayw9 wrote
[removed]
[deleted] t1_je5espa wrote
[removed]
almightySapling t1_je6kea4 wrote
I'm not worried about deepfake images, audio, or video.
I'm worried about deepfaked websites. I want to know that when I go to the Associated Press, or Reddit, I'm actually seeing that site, with content sourced from the appropriate avenues.
I do not want to live in a walled garden of my internet provider's AI, delivering me only the Xfinity Truth.
[deleted] t1_je4t6kl wrote
[removed]
EnsignElessar t1_je55vta wrote
Well its about scale and that makes a large difference.
rsta223 t1_je8d9kn wrote
>the sinking of the Lusitania was completely fabricated
No, it was a real ship that was genuinely sunk by an actual German U-boat.
KillBoxOne t1_je2798g wrote
Regulation? How about you just don’t do it? It’s like he is saying “I did it because the government didn’t stop me”!
Edit: I get the larger need for regulation. It's just funny how the guy who did it gets caught and then pivots to saying more regulation is needed.
popthestacks t1_je2mk78 wrote
Why not do it? There’s no need for regulation. That approach is ridiculous. The technology exists, live with it.
Low-Restaurant3504 t1_je2mrfh wrote
Big ol endorse.
[deleted] t1_je5evis wrote
[removed]
Better_Path5755 t1_je2qb2v wrote
The cat's outta the bag. Morality is mostly a human construct; if someone can do something, whether it's right or wrong, then best believe they will. I'm with you though, as an artist.
[deleted] t1_je3cw5o wrote
[deleted]
MetricVeil t1_je2wiu5 wrote
Yeah, that approach has really worked for robbery, murder, hacking... and all the other things people shouldn't do. :D
seamustheseagull t1_je4ecal wrote
It's fairly common for someone to make a demonstration of a power in order to prove the need to regulate it.
Whether or not he did this deliberately, the fact that the image has gained so much attention has obviously made him realise the danger here and now he's using his brief new platform to try and highlight that danger. I don't see the issue.
[deleted] t1_je5ewcf wrote
[removed]
EnsignElessar t1_je56b7r wrote
Ok, so I won't use it, and then just sit here and hope no one else does?
os12 t1_je2hlo6 wrote
Why would we want to involve government in regulating the means of making these images? The artists are free to draw and publish what they like... so, how is this different?
NoiceMango t1_je3whwy wrote
It's different when it's meant to impersonate someone.
[deleted] t1_je5ex76 wrote
[removed]
os12 t1_je5dzmn wrote
I fail to see a point. Anyone can write prose and try to impersonate a writer. Or paint and try to impersonate a painter. Or program and try to impersonate a software firm.
None of that is regulated.
NoiceMango t1_je66d80 wrote
Impersonating someone in a very accurate way is much different. Try seeing harder.
RayTheGrey t1_je4aoua wrote
A single person could conceivably outproduce thousands of artists drawing/photoshoping images. And to verify whether something is true or not, you need people.
I'm not sure if anything can be done about it, but the sheer volume of content enabled by generative models is a little concerning.
EnsignElessar t1_je56fv1 wrote
Artists are much more expensive to hire.
os12 t1_je5dhwn wrote
Sure and why does this kind of democratization call for government regulation?
EnsignElessar t1_je5e5vc wrote
Ok, so I'm just a regular guy, but I have two ideas for how this all ends very badly for most people. One is automated scamming. Before, you needed a call center in India or somewhere, which could be pretty expensive; if you wanted to scale, you had to hire people, which took time. But now you can just do it all on your own. The second issue is with just a prompt: "Create me a video of Biden announcing why he has just launched a tactical nuke on Russia." Oh boy. Even if we all just don't believe it, that would cause other issues, like not believing anything you see or read... I mean, you don't think these are issues?
os12 t1_je5f17b wrote
Yes, this kind of crap is the inevitable byproduct of new tech. That happens every time a new thing is created.
The good news is that companies will create similar tech to filter out the shit. Just think about scam email - this is the same thing at another level.
EnsignElessar t1_je5gm08 wrote
Oh, these are just the surface-level issues. The more you dig, the more you realize that things are looking bad. We don't have a real plan for AI safety, and it will likely end us as a result. This can't be one of those things where we get it wrong and improve later. Because if we fail just once, there are no more people left to learn from the mistake.
[deleted] t1_je5eytj wrote
[removed]
TiredOldLamb t1_je42itm wrote
That gif of the pope doing a trick in front of the bishops is like 10 years old at this point, and it's beyond hilarious, much better than the puffy coat. But now that they're using AI, it's crossing the line?
EnsignElessar t1_je561dt wrote
It's super easy to use. Even a child can type "Pope in funny jacket." Then you can use other AI solutions to further spread the disinformation.
[deleted] t1_je5eu56 wrote
[removed]
Excellent-Wishbone12 t1_je36m4r wrote
Whatever Lars
[deleted] t1_je5euvl wrote
[removed]
Cheshire1871 t1_je3qkh9 wrote
Why? They photoshop themselves beyond recognition. Those are fake too, so how is this different? They're both digitally altered. So, no more Photoshop?
[deleted] t1_je5f0zg wrote
[removed]
Troy-aka-Troy t1_je36v9k wrote
Fuck him, he’s fair game
GonnaGoFar t1_je2jtr0 wrote
Honestly, at this point, it seems like deep fake porn of celebrities and regular women is inevitable.
How can we stop that? Seriously.
BroForceOne t1_je33f11 wrote
Won't someone please think of the poor celebrities?!
KRA2008 t1_je3ajdn wrote
i'm sure i'm not the first to say it, but i think i'm going to go ahead and study oil painting for the next 50 years, so that the neural network in my head can create images just like this. if that doesn't work out i'll use photoshop to do the exact same thing. am i illegal?
In-Cod-We-Thrust t1_je44jn8 wrote
Every day I plead. I beg. I raise my voice to the Gods of all the heavens; “Please… just flood it one more time.”
EnsignElessar t1_je56ja7 wrote
/r/ControlProblem
Careful what you ask for.
AdGiers t1_je48msx wrote
Regulate what exactly?
EnsignElessar t1_je57unx wrote
AI development.
AdGiers t1_je5g27k wrote
How exactly, bearing in mind the powers that be can barely regulate anything digital, such as piracy and crypto?
EnsignElessar t1_je5gak9 wrote
I actually don't think they will be willing/able to do it. It's a hard problem to solve. But I never give up, so I'm asking anyway.
[deleted] t1_je5f09w wrote
[removed]
[deleted] t1_je2vox1 wrote
[removed]
[deleted] t1_je3cm4t wrote
[removed]
[deleted] t1_je3yq3k wrote
[deleted]
Biyeuy t1_je428ye wrote
Let's use it in the Ukraine war to defeat the Soviet fascists. Otherwise Putin takes the leader position in AI-powered warfare.
[deleted] t1_je42puk wrote
[deleted]
MensMagna t1_je439q9 wrote
How could anyone even think that image of the pope is real?
Neiko_R t1_je44v03 wrote
I don't know what people are trying to achieve with "regulation"; these AIs are already open source and can be used by anyone.
[deleted] t1_je5f46m wrote
[removed]
MeloveTHICCbootay t1_je46ggb wrote
Regulate this, regulate that. How about you stop trying to have the government regulate everything in our fucking lives? For fuck's sake, fuck off.
seamustheseagull t1_je4dypk wrote
It's about a decade ago now that I first recall conversations about this problem.
Back then we knew this was going to happen.
And there are many solutions to this problem which existed back then, including the use of digital signing for images and videos to verify when they were produced and by whom.
We've had at least a decade to prepare for this, and nobody in the media or tech sectors has bothered doing fucking anything.
So now we get a couple of years of pure chaos as fake images get produced which are virtually indistinguishable from reality, and everyone is scrambling to put measures in place to fix this.
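The signing approach mentioned above doesn't need exotic tech. Here's a minimal sketch in Python, using an HMAC over the image bytes as a stand-in for the public-key signatures a real provenance scheme (C2PA-style signing, for instance) would use so anyone can verify without the secret; the key and names here are invented for illustration:

```python
import hashlib
import hmac

# Hypothetical publisher secret. A real scheme would use a public/private
# key pair so verification doesn't require sharing the secret.
PUBLISHER_KEY = b"example-secret-key"

def sign_image(image_bytes: bytes) -> str:
    """Produce a tag binding the image bytes to the publisher."""
    return hmac.new(PUBLISHER_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Check that the image is unmodified since it was signed."""
    expected = sign_image(image_bytes)
    return hmac.compare_digest(expected, tag)

original = b"\x89PNG...raw image bytes..."
tag = sign_image(original)
print(verify_image(original, tag))            # True: untouched
print(verify_image(original + b"edit", tag))  # False: altered after signing
```

Note this only proves who published an image and that it hasn't been altered since; it says nothing about whether the content was fake to begin with.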
[deleted] t1_je5f4w4 wrote
[removed]
G33ONER t1_je4h8lq wrote
The Pope looked Dope, has he made a statement?
Glader t1_je4kcma wrote
Can someone please take Gilbert Gottfried's "The Aristocrats" clip, speech-to-text it, and feed it to an artist AI?
The_DashPanda t1_je4r5ru wrote
In this age of hyper-consumerist "late-stage" capitalism, the head of a world religion wearing expensive elitist clothes might just be too believable for some to distinguish artistic expression from visual record, but I'm sure they'll just blame the technology and push for censorship.
[deleted] t1_je5f3gk wrote
[removed]
banterism t1_je4uvgk wrote
We draw the line at puffy jackets, this is an outrage! Lol
[deleted] t1_je5f5m5 wrote
[removed]
H809 t1_je4zp9t wrote
Look, famous people: AI is dangerous for your image, so we need better regulation. Because if it's dangerous for you almighty individuals, it's bad for humanity. Fucking simp.
Jman1a t1_je55p69 wrote
“I am a man of the cloth a servant of God!” sits on his solid gold throne
Glissssy t1_je5hk3i wrote
I can't really see how that would be "the line", given this is just automated photoshopping, the kind of thing that has been a pastime online for many years.
ForeignSurround7769 t1_je7ioxu wrote
Wouldn’t it be an easy fix to make laws against using AI to impersonate anyone? We should all have a right to own our face and voice. That seems simple enough to regulate as well. It won’t stop all of it but will be better than nothing.
nadmaximus t1_je4870d wrote
How? Also...the pope is a celebrity?
Low-Restaurant3504 t1_je1yy1q wrote
Oh, the line is celebrities. People with money and influence. Gotcha. Not you or I. Our "betters". Just gonna go commission some deepfake porn of this guy's mother and see how he feels then.