[R] ChatGLM-6B - an open-source 6.2 billion parameter English/Chinese bilingual LLM trained on 1T tokens, supplemented by supervised fine-tuning, feedback bootstrap, and RLHF. Runs on consumer-grade GPUs (github.com)
Submitted by MysteryInc152 on March 18, 2023 at 5:01 PM in MachineLearning · 49 comments · 201 points
evangelion-unit-two wrote on March 19, 2023 at 3:54 PM, replying to gkaykck:
Tankie detected (1 point)