neoplastic_pleonasm t1_j6j72be wrote
Reply to comment by EmbarrassedHelp in OpenAI executives say releasing ChatGPT for public use was a last resort after running into multiple hurdles — and they're shocked by its popularity by steviaplath153
The ChatGPT model is in the neighborhood of 750GB, so sadly we won't be seeing anything remotely as capable that can run on consumer hardware any time soon.
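For context, a model's on-disk size can be roughly estimated from its parameter count. A minimal sketch, assuming GPT-3's published 175B parameter count (ChatGPT's actual parameter count and storage precision are not public):

```python
def model_size_gb(num_params: float, bytes_per_param: int) -> float:
    """Rough on-disk size of a model: parameters x bytes per parameter."""
    return num_params * bytes_per_param / 1e9

# 175e9 is GPT-3's published parameter count; the precision is an assumption.
fp32 = model_size_gb(175e9, 4)  # 32-bit floats -> ~700 GB
fp16 = model_size_gb(175e9, 2)  # 16-bit floats -> ~350 GB
print(f"fp32: {fp32:.0f} GB, fp16: {fp16:.0f} GB")
```

A 750GB figure would be consistent with a model of roughly that scale stored at full 32-bit precision, though the exact number is speculation.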
EmbarrassedHelp t1_j6lhtho wrote
Where are you getting the file size from?
slashd t1_j6jhrth wrote
> 750GB
That easily fits on a $50 1TB SSD
neoplastic_pleonasm t1_j6jk8gt wrote
Yep, now you only need a hundred thousand dollars more for a GPU cluster with enough VRAM to run inference with it.
NegotiationFew6680 t1_j6jmsiq wrote
Hahahaha
Now imagine how slow that would be.
There's a reason these models are run on distributed clusters. A single query to ChatGPT is likely being processed by multiple GPUs across dozens of machines.
gmes78 t1_j6k6myq wrote
You need to fit it in GPU VRAM. So go ahead and show me a consumer GPU with 750GB of VRAM.
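A quick back-of-the-envelope check of that point, assuming the 750GB figure from upthread and 24GB of VRAM (the capacity of a top-end consumer card such as an RTX 4090):

```python
import math

# Minimum number of consumer GPUs needed just to hold the weights,
# ignoring activation memory and other inference overhead.
model_gb = 750        # size claimed upthread (an assumption, not official)
vram_per_gpu_gb = 24  # high-end consumer card, e.g. RTX 4090
gpus_needed = math.ceil(model_gb / vram_per_gpu_gb)
print(gpus_needed)  # 32
```

So even with the largest consumer cards available, you'd need dozens of them just to load the weights, before doing any actual computation.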