Submitted by [deleted] t3_zalbxu in deeplearning
normie1990 t1_iyma2hh wrote
Reply to comment by suflaj in Will I ever need more than 24GB VRAM to train models like Detectron2 and YOLOv5? by [deleted]
>Be that as it may, using PyTorch itself, NVLink gets you less than 5% gains. Obviously not worth it compared to the 30-90% gains from a 4090.
Thanks, I think I have my answer.
Obviously I'm new to ML and didn't understand everything you tried to explain (which I appreciate). I do know this much: I'll be freezing layers when fine-tuning, so from your earlier comment I gather I won't need more than 24GB.
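For anyone landing here later, a minimal sketch of what layer freezing looks like in plain PyTorch. The torchvision ResNet-50 is just a hypothetical stand-in for a Detectron2/YOLOv5 backbone; the point is that parameters with `requires_grad = False` hold no gradients or optimizer state, which is what brings the VRAM needs down during fine-tuning.

```python
# Sketch: freezing a backbone and training only the head in PyTorch.
# ResNet-50 is a hypothetical stand-in for a Detectron2/YOLOv5 model.
import torch
import torchvision

model = torchvision.models.resnet50(weights="IMAGENET1K_V2")

# Freeze everything except the final classification head ("fc").
# Frozen parameters store no gradients or optimizer state, cutting VRAM use.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("fc.")

# Only hand the still-trainable parameters to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4)
```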