normie1990
normie1990 t1_iym8yq4 wrote
Reply to comment by suflaj in Will I ever need more than 24GB VRAM to train models like Detectron2 and YOLOv5? by [deleted]
I probably should have specified that I'll be doing fine-tuning, not training from scratch, if that makes any difference.
>Memory pools are a software feature.
I know it's a software feature. AFAIK PyTorch supports it, right?
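For my own reference, here's a minimal sketch of how I'd check whether 24GB is actually enough, by measuring peak VRAM over one forward/backward pass in PyTorch. The `nn.Linear` is just a stand-in for a real detector, not my actual setup:

```python
import torch
import torch.nn as nn

# Stand-in model and batch; swap in the real detector and real data.
model = nn.Linear(4096, 4096).cuda()
x = torch.randn(64, 4096, device="cuda")

# Reset the peak-memory counter, run one training step, then read it back.
torch.cuda.reset_peak_memory_stats()
loss = model(x).sum()
loss.backward()

peak_gb = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak VRAM this step: {peak_gb:.2f} GB")
```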
normie1990 t1_iym8leb wrote
Reply to comment by suflaj in Will I ever need more than 24GB VRAM to train models like Detectron2 and YOLOv5? by [deleted]
I thought memory pooling was the whole point of NVLink?
>Those models already require more than 24GB RAM if you do not accumulate your gradients
Could you elaborate?
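(For anyone else reading: my rough understanding of gradient accumulation is that you run several small micro-batches and only step the optimizer once, so peak VRAM tracks the micro-batch size rather than the full effective batch. A toy sketch of that idea, with made-up sizes:)

```python
import torch
import torch.nn as nn

model = nn.Linear(512, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
accum_steps = 4  # effective batch = 4 x micro-batch

optimizer.zero_grad()
for step in range(accum_steps):
    x = torch.randn(8, 512, device="cuda")              # micro-batch of 8
    y = torch.randint(0, 10, (8,), device="cuda")
    loss = loss_fn(model(x), y) / accum_steps            # scale so gradients average out
    loss.backward()                                      # gradients accumulate in .grad
optimizer.step()                                         # one optimizer step per 4 micro-batches
optimizer.zero_grad()
```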
normie1990 OP t1_iyip6i2 wrote
Reply to comment by Mr_Ubik in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
This is what I have in mind for a case - https://www.corsair.com/us/en/Categories/Products/Cases/Crystal-Series-680X-RGB-High-Airflow-Tempered-Glass-ATX-Smart-Case/p/CC-9011168-WW
normie1990 OP t1_iyiiu26 wrote
Reply to comment by Mr_Ubik in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
No, I can't afford 4x 4090s. I'll start with a single 4090 with a custom water loop and add a second 4090 in a year or so. From what I understand, a high-end gaming motherboard can support that, judging by the builds from mifcom.de
normie1990 OP t1_iyifg7t wrote
Reply to comment by Mr_Ubik in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
The plan is to dual-boot Windows and Ubuntu. Benchmarks from Lambda Labs show that the 3090 and A6000 are very close. Are there any benchmarks with the 4090?
normie1990 OP t1_iyiclwr wrote
Reply to comment by Mr_Ubik in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
Strictly for ML, probably, but it will be my main PC, so it needs to handle gaming too.
normie1990 OP t1_iyi6fix wrote
Reply to comment by Ataru074 in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
Yes, I think I'll go with a Ryzen 9 platform and a single 4090 GPU. It's not very expandable in terms of adding a ton of RAM and multiple GPUs, but it should be good enough for training Detectron2 and YOLO... I think. And it costs way less than a Threadripper platform.
normie1990 OP t1_iyi2ff9 wrote
Reply to comment by duschendestroyer in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
That is very helpful, thanks.
EDIT: So they're cooling a CPU and four 4090s with a 420mm and two 360mm radiators? And I've never seen radiators stacked like that, is that even legal? lol
normie1990 OP t1_iyi2cpn wrote
Reply to comment by Ataru074 in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
It will also be my main workstation for coding, playing games, etc. I just want it to do AI as well :)
normie1990 OP t1_iyhja0s wrote
Reply to comment by suflaj in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
He meant that just one of the radiators needs to be mounted higher.
normie1990 OP t1_iyh422n wrote
Reply to comment by suflaj in [Discussion] Should I go with Threadripper 5000 and multi-GPU, or Ryzen 9 with single GPU? by normie1990
Sorry if this has been asked a lot, I'm new to this sub.
As for the case, I'm going for the Corsair 680X; it has room for a 360mm and a 240mm radiator. I'm not sure if I should put a radiator on the bottom. If yes, then an additional 240mm.
normie1990 t1_iyma2hh wrote
Reply to comment by suflaj in Will I ever need more than 24GB VRAM to train models like Detectron2 and YOLOv5? by [deleted]
>Be that as it may, using PyTorch itself, NVLink gets you less than 5% gains. Obviously not worth it compared to 30-90% gains from a 4090.
Thanks, I think I have my answer.
Obviously I'm new to ML and didn't understand everything you tried to explain (which I appreciate). I do know this much: I'll be freezing layers when fine-tuning, so from your earlier comment I guess I won't need more than 24GB.
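Concretely, the kind of freezing I mean is something like the sketch below, using a torchvision ResNet as a stand-in for the real detector backbone (the class count is made up):

```python
import torch
import torchvision

# Load a pretrained backbone and freeze all of its parameters,
# so no gradients or optimizer state are kept for them.
model = torchvision.models.resnet50(weights="IMAGENET1K_V2")
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head; only this part will be trained.
model.fc = torch.nn.Linear(model.fc.in_features, 5)   # 5 classes, just an example

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4)      # optimizer only sees the new head
```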