Submitted by Maxerature t3_10ir1su in MachineLearning
I have 2 GPUs, an RTX 3080 and a GTX 1080 Ti. Currently I am using only the 3080, and its 10 GB of VRAM doesn't seem to cut it. Can I use both the 3080 and the 1080 Ti simultaneously? My motherboard has multiple PCI-E x16 slots. My OS is Pop!_OS. Is there any way to use multiple GPUs of different types? I'm particularly looking at KoboldAI, but it would also be useful in general. I know that SLI won't work since they're different GPUs.
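For reference, a quick way to check which cards a framework can actually see. This is a sketch assuming PyTorch with a CUDA build; on a CPU-only machine it simply reports zero devices:

```python
import torch

# List every GPU visible to PyTorch, with its name and total VRAM.
# A mixed 3080 + 1080 Ti setup should show up as cuda:0 and cuda:1.
count = torch.cuda.device_count()
print(f"visible GPUs: {count}")
for i in range(count):
    props = torch.cuda.get_device_properties(i)
    print(f"cuda:{i} -> {props.name}, {props.total_memory / 2**30:.1f} GiB")
```

If both cards appear here, software that lets you assign layers per device (KoboldAI has such a setting) can use them together without SLI.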
jess-plays-games t1_j5fy01j wrote
You could SLI a 1080 with another matching card easily enough, but SLI doesn't pool memory; each card just keeps its own copy of the data in its own VRAM.
The 2000 series and later don't support SLI anymore and use NVLink instead, which can share VRAM between cards.
There's a handy little program called "any sli" I used to use.
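Worth noting that for ML workloads you don't need SLI or NVLink at all: frameworks like PyTorch address each card separately ("cuda:0", "cuda:1") and let you place layers on whichever device you want. A minimal sketch of that kind of naive model splitting, assuming a CUDA build of PyTorch (it falls back to CPU when fewer than two GPUs are present, so it runs anywhere):

```python
import torch
import torch.nn as nn

def pick_devices():
    # Use two distinct GPUs if available, one GPU twice if only one,
    # else CPU, so the sketch runs on any machine.
    n = torch.cuda.device_count()
    if n >= 2:
        return torch.device("cuda:0"), torch.device("cuda:1")
    if n == 1:
        return torch.device("cuda:0"), torch.device("cuda:0")
    return torch.device("cpu"), torch.device("cpu")

dev0, dev1 = pick_devices()

# Toy model split across two devices: one half per card. The GPUs don't
# have to match, since each half lives entirely in its own card's VRAM.
part1 = nn.Linear(16, 32).to(dev0)
part2 = nn.Linear(32, 4).to(dev1)

x = torch.randn(8, 16, device=dev0)
h = part1(x).to(dev1)   # move activations between devices explicitly
y = part2(h)
print(y.shape)          # torch.Size([8, 4])
```

This is what KoboldAI-style layer assignment does under the hood: the cards never talk over SLI, activations just get copied across PCI-E between halves.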