Atom_101 t1_is4b95t wrote
Reply to comment by M4xM9450 in [D] Are GAN(s) still relevant as a research topic? or is there any idea regarding research on generative modeling? by aozorahime
Diffusion is inherently slower than GANs: sampling takes N forward passes of the network versus a single pass for a GAN. You can use tricks to make it faster, like latent diffusion, which does the N denoising passes with a smaller network in latent space and only a single forward pass through the decoder. But as a method, diffusion is slower.
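To make the cost difference concrete, here's a minimal sketch with toy untrained networks standing in for real models (the timestep input to the denoiser is omitted for brevity; all shapes are made up):

```python
import torch
import torch.nn as nn

# Toy stand-ins for trained models, just to count network evaluations.
gan_generator = nn.Sequential(nn.Linear(128, 3 * 64 * 64))
denoiser = nn.Sequential(nn.Linear(3 * 64 * 64, 3 * 64 * 64))

# GAN sampling: one forward pass, z -> image.
z = torch.randn(1, 128)
image = gan_generator(z)                      # 1 network evaluation total

# DDPM-style diffusion sampling: N forward passes of the denoiser.
N = 1000
betas = torch.linspace(1e-4, 0.02, N)
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

x = torch.randn(1, 3 * 64 * 64)               # start from pure noise
for t in reversed(range(N)):
    eps = denoiser(x)                         # 1 network evaluation per step
    # standard DDPM posterior mean update
    x = (x - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
    if t > 0:
        x = x + torch.sqrt(betas[t]) * torch.randn_like(x)
```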
SleekEagle t1_is69o54 wrote
There are models that use continuous differential equations (DEs) rather than discrete iterations, both diffusion-adjacent methods like SMLD/NCSN and entirely distinct ones!
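Roughly, the continuous view turns sampling into solving an ODE, so an off-the-shelf adaptive solver decides how many network evaluations to spend instead of a fixed N. A toy sketch, with an analytic score standing in for a trained score network and the variance-exploding (VE) schedule from the score-SDE line of work:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy stand-in: score of a standard Gaussian, i.e. grad log N(0, I) = -x.
# A trained score network s_theta(x, t) would go here.
def score(x, t):
    return -x

# Geometric VE noise schedule, sigma(t) from sigma_min to sigma_max.
sigma_min, sigma_max = 0.01, 50.0

def sigma(t):
    return sigma_min * (sigma_max / sigma_min) ** t

def dsigma2_dt(t):
    s = sigma(t)
    return 2 * s * s * np.log(sigma_max / sigma_min)

# Probability-flow ODE for the VE SDE:
#   dx/dt = -0.5 * d[sigma^2(t)]/dt * score(x, t)
def ode_rhs(t, x):
    return -0.5 * dsigma2_dt(t) * score(x, t)

# Integrate from t=1 (pure noise) down to t~0 (data); the adaptive
# solver chooses the number of function evaluations on its own.
x1 = np.random.randn(16) * sigma(1.0)
sol = solve_ivp(ode_rhs, (1.0, 1e-3), x1, rtol=1e-4, atol=1e-4)
x0 = sol.y[:, -1]
```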
Atom_101 t1_is6igij wrote
Just when I thought I understood the math behind DMs, they went ahead and added freaking DEs to it? Guess I should have paid more attention to math in college.
SleekEagle t1_is6trdz wrote
I'm really excited to see what the next few years bring. I've always felt like there's a lot of room for growth from higher level math, and it seems like that's beginning to happen.
I'll have a blog coming out soon on another physics-inspired model like DDPM, stay tuned!
puppet_pals t1_is5aad2 wrote
people are also running with distillation
gwern t1_is5h3xj wrote
You can distill GANs too, though, so the performance gap remains, and some applications may still be out of reach: distilling a GAN may get you realtime synthesis, while distilling a diffusion model typically only makes it about as fast as a GAN (maybe).
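For reference, the flavor of diffusion distillation usually meant here is progressive distillation (Salimans & Ho, 2022), where a student learns to match two of the teacher's sampling steps with one of its own, halving the step count each round. A minimal sketch of one training step, with toy networks and a placeholder in place of a real DDIM update:

```python
import torch
import torch.nn as nn

# Toy stand-ins; real teacher/student would be trained U-Nets.
teacher = nn.Linear(64, 64)
student = nn.Linear(64, 64)
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

def ddim_step(model, x, t, t_next):
    # Placeholder for a deterministic DDIM update from t to t_next;
    # a real implementation uses the noise-prediction parameterization.
    return model(x)

# One progressive-distillation training step: the student is trained
# so that ONE of its steps matches TWO consecutive teacher steps.
x_t = torch.randn(8, 64)
t, t_mid, t_next = 1.0, 0.75, 0.5

with torch.no_grad():
    target = ddim_step(teacher, ddim_step(teacher, x_t, t, t_mid), t_mid, t_next)

pred = ddim_step(student, x_t, t, t_next)
loss = ((pred - target) ** 2).mean()
opt.zero_grad()
loss.backward()
opt.step()
```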
pm_me_your_ensembles t1_is5ppy3 wrote
It's entirely possible to do a lot better than N forward passes.
Atom_101 t1_is5z87v wrote
How so?
pm_me_your_ensembles t1_is6ttve wrote
Afaik self-conditioning helps with the process, and there has been a lot of work on reducing the number of steps through distillation and quantization.
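A rough sketch of what self-conditioning looks like at sampling time, loosely after Chen et al.'s "Analog Bits" paper (toy denoiser, and a simplified update rule standing in for a real DDPM/DDIM posterior):

```python
import torch
import torch.nn as nn

# Toy denoiser; the noisy input is concatenated with a self-conditioning
# channel carrying the model's own previous estimate of the clean sample.
D = 64
denoiser = nn.Linear(2 * D, D)

def predict_x0(x_t, x0_prev):
    return denoiser(torch.cat([x_t, x0_prev], dim=-1))

def update_step(x_t, x0_est, t):
    # Simplified deterministic update pulling x_t toward the x0 estimate;
    # a real sampler would use the DDPM/DDIM posterior here.
    w = 1.0 / (t + 1)
    return (1 - w) * x_t + w * x0_est

# Sampling with self-conditioning: each step feeds the previous x0
# estimate back in as extra input, which tends to help quality when
# the number of steps is small.
N = 50
x = torch.randn(1, D)
x0_est = torch.zeros(1, D)   # zero estimate on the first step
for t in reversed(range(N)):
    x0_est = predict_x0(x, x0_est)
    x = update_step(x, x0_est, t)
```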
Atom_101 t1_is6wyys wrote
I see. Thanks!