Knurpel t1_ird8vtd wrote
Reply to comment by MyActualUserName99 in Deeplearning and multi-gpu or not by ronaldxd2
>extremely easy
... depends on your coding proficiency. As I said, "it's not as easy as sticking in another GPU."
Knurpel t1_ir90nkx wrote
Reply to Deeplearning and multi-gpu or not by ronaldxd2
Assuming that your deep learning stack uses CUDA: Multi-GPU CUDA is not for the faint of heart, and it will most likely require intense code wrangling on your part. It's not as easy as sticking in another GPU.
Your GPUs support the outgoing NVLink, and using it would make things easier on you.
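As a rough illustration of what that code wrangling looks like (a minimal sketch assuming a PyTorch stack; the framework and the DistributedDataParallel route are my assumptions, not something OP specified), even the standard multi-GPU path means spawning one process per GPU, initializing a process group over NCCL, and wrapping the model, none of which happens by itself when you add a second card:

```python
# Hypothetical minimal multi-GPU training sketch (PyTorch DDP, one process per GPU).
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank: int, world_size: int):
    # Each process owns one GPU; NCCL is the usual backend for CUDA devices.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    model = torch.nn.Linear(128, 10).cuda(rank)  # stand-in for a real network
    model = DDP(model, device_ids=[rank])        # gradients are synced across GPUs

    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    x = torch.randn(32, 128, device=f"cuda:{rank}")
    y = torch.randint(0, 10, (32,), device=f"cuda:{rank}")

    loss = F.cross_entropy(model(x), y)
    loss.backward()  # the all-reduce across GPUs happens here
    opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = torch.cuda.device_count()
    mp.spawn(worker, args=(world_size,), nprocs=world_size)
```

And that's the easy case; model-parallel or custom-kernel CUDA code is a bigger rewrite still.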
Knurpel t1_ireu5b5 wrote
Reply to comment by GPUaccelerated in Deeplearning and multi-gpu or not by ronaldxd2
Ooops. I'm only familiar w/ the 90