Baeocystin t1_j9e6s12 wrote on February 21, 2023 at 7:06 AM
Reply to comment by Last-Belt-4010 in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
The tl;dr for all GPU questions is that CUDA is the answer. There are no real contenders, not even 'kinda' ones.
I'm not happy about the monopoly, but that's where we're at, and nothing on the horizon suggests it will change, either.
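For anyone wondering what "CUDA is the answer" means in practice: a minimal sketch using PyTorch (just one common CUDA-backed framework, and assuming a build compiled with CUDA support) to check whether a CUDA GPU is visible and how much VRAM it has before trying to load a model:

```python
import torch

# Nearly every mainstream ML framework gates GPU work on a check like this.
if torch.cuda.is_available():
    device = torch.device("cuda")
    print(torch.cuda.get_device_name(0))
    # Free/total VRAM in bytes; useful for judging whether a given
    # LLM will fit on an 8 GB vs. 24 GB card.
    free, total = torch.cuda.mem_get_info()
    print(f"{free / 1e9:.1f} GB free of {total / 1e9:.1f} GB")
else:
    # Fallback: CPU works but is painfully slow for LLM inference.
    device = torch.device("cpu")
```

If `torch.cuda.is_available()` returns False on a non-NVIDIA card, that's the monopoly in action: the tooling largely assumes CUDA.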