
Surur t1_j9ntwmv wrote

> I know that the computing power necessary for the most successful models far outstrip what your average consumer is capable of generating.

The training is resource intensive. The running is not, which is demonstrated by ChatGPT being able to support millions of users concurrently.

Even if you need a $3000 GPU to run it, that's a trivial cost for the help it can provide.
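As a rough back-of-envelope sketch of why inference can fit on a consumer GPU (the 13B parameter count and byte-per-weight figures here are illustrative assumptions; ChatGPT's actual model size is not public):

```python
# Approximate VRAM needed just to hold an LLM's weights for inference
# (not training). 13B parameters is a hypothetical size for illustration.
def inference_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory for weights alone, in GB: params * bytes per parameter."""
    return params_billions * 1e9 * bytes_per_param / 1e9

weights_fp16 = inference_vram_gb(13, 2)    # 16-bit weights
weights_int4 = inference_vram_gb(13, 0.5)  # 4-bit quantized weights

print(f"13B model, fp16: ~{weights_fp16:.0f} GB")   # ~26 GB
print(f"13B model, int4: ~{weights_int4:.1f} GB")   # ~6.5 GB
```

At 4-bit quantization a model of that size fits comfortably in the 24 GB of VRAM on a high-end consumer card, which is the kind of hardware the comment is pointing at. Training the same model would need many times that, plus gradients and optimizer state.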

3

LettucePrime OP t1_j9nv0b8 wrote

Ehh, no, actually, that's not true. A ChatGPT inference is several times more expensive than a typical Google search, & it seems to run on the same hardware resources used to train the model, operating at similar intensity.

1

Surur t1_j9nv53o wrote

That's not what I said lol. I said it's manageable on hardware a consumer can buy.

3