[N] GPT-4 has 1 trillion parameters (the-decoder.com)
Submitted by mrx-ai on March 25, 2023 at 3:58 PM in MachineLearning
frequenttimetraveler wrote on March 25, 2023 at 10:10 PM:
Altman did not say anything about that on the Lex Fridman show. He said the 100T rumor was just a meme. How would runtime scale with parameter size? Can we infer whether the 1T figure is true from the latency of the responses?
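A rough first-order answer: autoregressive decoding is usually memory-bandwidth bound, so per-token latency scales roughly linearly with parameter count divided by aggregate memory bandwidth. The sketch below illustrates that scaling; the GPU count, bandwidth, and weight precision are assumptions for illustration, not anything disclosed about GPT-4, and real latency also depends on batching, parallelism strategy, and possible sparsity or mixture-of-experts routing, so latency alone cannot pin down the parameter count.

```python
# Back-of-envelope estimate: generating one token requires streaming every
# weight from memory once, so an ideal lower bound on per-token latency is
# (parameter count * bytes per parameter) / aggregate memory bandwidth.
# Hardware numbers below are assumptions, not published GPT-4 details.

def per_token_latency_s(n_params: float,
                        bytes_per_param: float = 2.0,    # fp16/bf16 weights (assumed)
                        bandwidth_gb_s: float = 2000.0,  # ~A100 80GB HBM bandwidth per GPU (assumed)
                        n_gpus: int = 8) -> float:       # assumed tensor-parallel group size
    """Rough lower bound on decode latency per token, memory-bandwidth bound."""
    total_bytes = n_params * bytes_per_param
    aggregate_bandwidth = bandwidth_gb_s * 1e9 * n_gpus
    return total_bytes / aggregate_bandwidth

for params in (175e9, 1e12):
    t = per_token_latency_s(params)
    print(f"{params / 1e9:.0f}B params: ~{t * 1000:.0f} ms/token, "
          f"~{1 / t:.0f} tokens/s (ideal; ignores compute, KV cache, batching)")
```

Under these assumptions a dense 1T model would be roughly 5-6x slower per token than a 175B model on the same hardware, but since the serving hardware and any sparsity are unknown, observed latency only loosely constrains the size.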