[D] Running large language models on a home PC? Submitted by Zondartul (t3_zrbfcr) on December 21, 2022 at 5:29 AM in MachineLearning · 41 comments · 86
Final-Rush759 t1_j12zqjw wrote on December 21, 2022 at 7:56 AM Model parallelism. But you need more than one card. Buy an A6000, which has 48 GB of VRAM. Permalink 4
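For illustration, a minimal sketch of one common way to do this kind of model splitting across several GPUs: Hugging Face Transformers with Accelerate's device_map="auto", which spreads the model's layers over all visible cards. The model name below is just an example, not something from the thread.

```python
# Sketch: naive model parallelism with Hugging Face Transformers + Accelerate.
# Assumes `transformers` and `accelerate` are installed and multiple GPUs are visible.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # example model; substitute the one you want to run

tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" asks Accelerate to place the model's layers across all
# available GPUs (spilling to CPU RAM if they still don't fit).
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype="auto",  # load weights in the checkpoint's native precision
)

# Inputs go to the first GPU; Accelerate moves activations between devices as needed.
inputs = tokenizer("Running large language models at home is", return_tensors="pt").to("cuda:0")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This is layer-wise (pipeline-style) splitting rather than true tensor parallelism, so only one GPU is active at a time, but it lets a model that exceeds a single card's VRAM run across two or more cards.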