Important_Put8366 t1_ismycko wrote

4090 now, or wait for the 4090 Ti?

I am interested in using and training Stable Diffusion models (specifically the recent NovelAI leak), so I need a new graphics card.

The 4090 has 24 GB of VRAM, and the 4090 Ti, I've heard, will have 48 GB. It seems to me that the 4090 Ti would be much better, since large language models and diffusion models eat a lot of VRAM. I currently own a 1070, so I can do some generation but not training.
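(For context, generation fits on the 1070's 8 GB mainly because inference can run in half precision with attention slicing. A minimal sketch using Hugging Face diffusers; the checkpoint ID and prompt are just placeholders, not a specific recommendation:)

```python
# Low-VRAM Stable Diffusion inference sketch (assumes torch + diffusers installed).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder checkpoint
    torch_dtype=torch.float16,         # half precision roughly halves VRAM use
)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()        # trades some speed for lower peak VRAM

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("out.png")
```

Training (fine-tuning) keeps gradients and optimizer state in memory on top of the weights, which is why it doesn't fit where inference does.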

Does anyone have any idea when Nvidia will release the 4090 Ti? If I'd need to wait another half a year, I might as well just get a 4090 now.