Surur t1_j9ntwmv wrote
Reply to comment by LettucePrime in Question for any AI enthusiasts about an obvious (?) solution to a difficult LLM problem in society by LettucePrime
> I know that the computing power necessary for the most successful models far outstrips what your average consumer is capable of generating.
The training is resource-intensive. The running is not, as demonstrated by ChatGPT serving millions of users concurrently.
Even if you need a $3000 GPU to run it, that's a trivial cost for the help it can provide.
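The "runs on consumer hardware" claim can be sanity-checked with simple arithmetic. The model sizes, GPU memory, and 4-bit quantization below are my own illustrative assumptions, not figures from this thread:

```python
# Rough check of what fits on a single consumer GPU.
# All numbers are illustrative assumptions.

CONSUMER_GPU_GB = 24  # e.g. a high-end ~$1500-3000 consumer card

def model_footprint_gb(params_billion, bits_per_param):
    """Memory for the weights only; activations and the KV cache
    need additional headroom on top of this."""
    return params_billion * 1e9 * (bits_per_param / 8) / 1e9

# A hypothetical 13B-parameter model, full precision vs quantized:
fp16_gb = model_footprint_gb(13, 16)  # 26.0 GB -> does not fit in 24 GB
int4_gb = model_footprint_gb(13, 4)   #  6.5 GB -> fits comfortably

print(fp16_gb, int4_gb)
```

This is why quantized mid-size models are feasible on one consumer card, while a GPT-3-class model is not.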
LettucePrime OP t1_j9nv0b8 wrote
Ehh no actually, that's not true. A ChatGPT inference is several times more expensive than a typical Google search, and it appears to run on the same class of hardware used to train the model, at a similar intensity.
Surur t1_j9nv53o wrote
That's not what I said lol. I said it's manageable on hardware a consumer can buy.
LettucePrime OP t1_j9nw9fu wrote
I understand you now, my apologies.
Surur t1_j9nwi3k wrote
Sure, NP, and you are partially right also lol. It may cost closer to $80,000 to have your own ChatGPT instance.
https://twitter.com/tomgoldsteincs/status/1600196988703690752
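That figure can be reproduced with a back-of-envelope calculation. The parameter count, GPU memory size, and per-card price below are my own assumptions for illustration, not numbers from the linked tweet:

```python
# Back-of-envelope estimate of the hardware needed to self-host a
# GPT-3-class model. All constants are illustrative assumptions.

PARAMS = 175e9           # assumed parameter count (GPT-3 scale)
BYTES_PER_PARAM = 2      # fp16/bf16 weights

weight_gb = PARAMS * BYTES_PER_PARAM / 1e9   # ~350 GB of weights alone

GPU_MEM_GB = 80          # e.g. an 80 GB datacenter GPU
GPU_PRICE_USD = 15_000   # rough per-card price (assumption)

# Weights alone set a floor on the GPU count; real deployments need
# extra headroom for activations and the KV cache.
gpus_needed = -(-weight_gb // GPU_MEM_GB)    # ceiling division
hardware_cost = gpus_needed * GPU_PRICE_USD

print(f"weights: {weight_gb:.0f} GB")
print(f"GPUs needed (weights only): {gpus_needed:.0f}")
print(f"hardware cost: ${hardware_cost:,.0f}")
```

Under these assumptions the weights alone demand five such cards, about $75,000 of GPUs before servers, power, and redundancy, which is in the same ballpark as the tweet's estimate.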
But then that sounds like a business opportunity lol.