MorallyDeplorable t1_jc32jfw wrote
Reply to comment by 3deal in [P] Discord Chatbot for LLaMA 4-bit quantized that runs 13b in <9 GiB VRAM by Amazing_Painter_7692
It should, yeah. I'm running it on a 4090, which has the same amount of VRAM. It takes about 20-21 GB of RAM.
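For context, a rough back-of-the-envelope check of why a 4-bit 13B model can fit in under 9 GiB of VRAM (the parameter count and byte math are illustrative; real usage adds overhead for activations and KV cache, which is why the full run above needs more total RAM):

```python
def quantized_weight_gib(n_params: float, bits_per_param: int) -> float:
    """Approximate weight memory in GiB for a quantized model."""
    return n_params * bits_per_param / 8 / 2**30

# 13e9 params at 4 bits each is roughly 6.05 GiB of weights alone,
# leaving some headroom under the 9 GiB figure for runtime overhead.
print(round(quantized_weight_gib(13e9, 4), 2))
```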
3deal t1_jc32o55 wrote
Cool, it's a shame there's no download link to try it 🙂