Submitted by killver t3_y2vvne in MachineLearning
MohamedRashad t1_is68bbw wrote
The RTX 3090 is being sold now for as low as $1,000 ... I think it will be the best option for a lot of researchers here.
computing_professor t1_itnwsxg wrote
What about 2x3090 vs 1x4090? Cost vs. performance?
MohamedRashad t1_itnxalb wrote
The bigger VRAM pool (2x 3090) is the better deal in my opinion: you get to distribute your training and run more experiments.
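(Not from the thread, just a minimal sketch of the "distribute your training" point: a PyTorch DistributedDataParallel loop that spreads data-parallel training across two GPUs such as 2x 3090. The model, batch shapes, and hyperparameters are placeholders.)

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets LOCAL_RANK / RANK / WORLD_SIZE for each spawned process
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(1024, 10).cuda(local_rank)  # placeholder model
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(100):
        # placeholder batch; in practice use a DataLoader with DistributedSampler
        x = torch.randn(32, 1024, device=local_rank)
        y = torch.randint(0, 10, (32,), device=local_rank)
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()   # gradients are all-reduced across the two GPUs
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with `torchrun --nproc_per_node=2 train.py`, each process drives one of the two cards and gradients are averaged across the pair, which is the "more experiments / bigger effective batch" upside of 2x 3090.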
computing_professor t1_itnzkvv wrote
I guess it's shareable via NVLink. Usually a pair of GeForce cards can't combine VRAM.
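(A quick way to check whether two cards can reach each other's memory directly, e.g. over an NVLink bridge, is peer-to-peer access; `nvidia-smi topo -m` also shows the link topology. This is a minimal sketch, not from the thread.)

```python
import torch

# Check GPU-to-GPU peer access (true over NVLink or PCIe P2P-capable setups)
if torch.cuda.device_count() >= 2:
    print("GPU0 -> GPU1 peer access:", torch.cuda.can_device_access_peer(0, 1))
    print("GPU1 -> GPU0 peer access:", torch.cuda.can_device_access_peer(1, 0))
else:
    print("Fewer than two GPUs visible.")
```

Note that even with peer access enabled, the two 24 GB pools don't appear as one 48 GB device to an ordinary training script; fitting a model that exceeds one card's VRAM still requires explicit model parallelism or sharding in the framework.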