RuairiSpain t1_jacukww wrote
Reply to comment by A-Delonix-Regia in PC GPU Shipments Drop 35% Year-over-Year in Q4 2022: Report by Stiven_Crysis
ChatGPT's underlying model (GPT-3 class) has around 175 billion parameters; to hold that in memory you'd need several 80GB NVIDIA cards, at roughly $30,000 each. As AI models grow they'll need more RAM, and cloud is the cheapest way for companies to timeshare that hardware.
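For anyone who wants to sanity-check that, here's a rough back-of-envelope sketch in Python (the parameter count, FP16 precision, and $30k card price are assumptions; OpenAI's actual serving setup isn't public):

```python
# Back-of-envelope VRAM estimate for serving a GPT-3-class model.
# All numbers are illustrative assumptions, not OpenAI's real setup.

params = 175e9           # assumed parameter count (GPT-3 class)
bytes_per_param = 2      # FP16/BF16 weights
card_vram_gb = 80        # e.g. NVIDIA A100 80GB
card_price_usd = 30_000  # rough per-card price

weights_gb = params * bytes_per_param / 1e9
cards_needed = -(-weights_gb // card_vram_gb)  # ceiling division

print(f"Weights alone: {weights_gb:.0f} GB")
print(f"Cards needed:  {cards_needed:.0f} (~${cards_needed * card_price_usd:,.0f})")
# Weights alone: 350 GB
# Cards needed:  5 (~$150,000)
```

And that's just the weights; inference also needs VRAM for activations and the attention KV cache on top of that.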
It's not just training the models; querying them (inference) also needs those in-memory calculations. I'm not expecting gamers to buy these cards. But scale up the number of users querying OpenAI, Bing x ChatGPT, or Google x Bard, plus all the other AI competitors, and there will be big demand for large-RAM GPUs.