slashd t1_j6jhrth wrote
Reply to comment by neoplastic_pleonasm in OpenAI executives say releasing ChatGPT for public use was a last resort after running into multiple hurdles — and they're shocked by its popularity by steviaplath153
> 750GB
That easily fits on a $50 1TB SSD
neoplastic_pleonasm t1_j6jk8gt wrote
Yep, now you only need a hundred thousand dollars more for a GPU cluster with enough VRAM to run inference with it.
NegotiationFew6680 t1_j6jmsiq wrote
Hahahaha
Now imagine how slow that would be.
There's a reason these models run on distributed clusters: a single ChatGPT query is likely processed by multiple GPUs across dozens of machines.
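A rough sketch of why splitting across GPUs works at all: in tensor parallelism, a layer's weight matrix is sharded across devices, each device computes a partial result, and the pieces are combined. The device count and matrix sizes below are illustrative assumptions, not ChatGPT's actual topology.

```python
import numpy as np

# Hypothetical tensor-parallelism sketch: split a layer's weight matrix
# column-wise across "devices"; each computes its shard of the output,
# and concatenating the shards reproduces the full matmul.
rng = np.random.default_rng(0)
d_model, d_out, n_devices = 8, 12, 4   # illustrative sizes

x = rng.standard_normal(d_model)            # activations (replicated on every device)
W = rng.standard_normal((d_model, d_out))   # full weight matrix

# Each "GPU" holds d_out / n_devices columns of W.
shards = np.split(W, n_devices, axis=1)

# Each device computes its partial output independently.
partial_outputs = [x @ w for w in shards]

# Concatenation (the all-gather step) matches the single-device result.
y = np.concatenate(partial_outputs)
assert np.allclose(y, x @ W)
```

In a real deployment each shard's matmul runs on a separate GPU, so the per-device VRAM requirement shrinks by roughly the device count, at the cost of communication between machines.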
gmes78 t1_j6k6myq wrote
You need to fit it in GPU VRAM. So go ahead and show me a consumer GPU with 750GB of VRAM.
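The back-of-envelope math behind that point, assuming a GPT-3-scale model of 175B parameters (the one published figure; the 24 GiB consumer-card ceiling is also an assumption, roughly a flagship gaming GPU):

```python
# Rough VRAM math for a 175B-parameter model; figures are illustrative.
params = 175e9      # assumed parameter count (GPT-3 scale)
bytes_fp32 = 4      # bytes per parameter at fp32
bytes_fp16 = 2      # bytes per parameter at fp16

gib = 1024**3
print(f"fp32 weights: {params * bytes_fp32 / gib:.0f} GiB")
print(f"fp16 weights: {params * bytes_fp16 / gib:.0f} GiB")

# Even halved to fp16, the weights alone dwarf a ~24 GiB consumer card,
# before counting activations and KV cache during inference.
consumer_vram_gib = 24  # assumed flagship consumer GPU
print(f"GPUs needed for fp16 weights alone: "
      f"{params * bytes_fp16 / gib / consumer_vram_gib:.0f}+")
```

So even in half precision the weights need on the order of a dozen-plus top-end consumer cards just to be loaded, which is why inference happens on dedicated multi-GPU clusters.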