
Tuggummii t1_j3kzxy6 wrote

Unfortunately, I don't have enough knowledge to answer that question.

3

learningmoreandmore OP t1_j3l17yp wrote

No problem! Thanks for the insight regarding its capabilities and costs.

2

Nmanga90 t1_j3xxwj6 wrote

Running it locally will not cut it unless you have a high-performance machine with lab-grade GPUs for inference. The reason these AI models are so expensive to use is that they are genuinely expensive to run. The provider is probably running 2 parallel instances of the model on a single A100, and has likely duplicated that architecture 10,000 times. An A100 is about $10k used and $20k new. You can also rent one for roughly $2 per hour; the rough math is sketched below.
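
A quick back-of-the-envelope sketch in Python of what those figures imply. All inputs (the per-GPU prices, 2 instances per GPU, the 10,000x replication, the ~$2/hour rental rate) are the comment's rough estimates, not published numbers:

```python
# Back-of-the-envelope math using the rough estimates from the
# comment above; none of these figures are published numbers.

A100_PRICE_USED = 10_000   # USD, approximate used price
A100_PRICE_NEW = 20_000    # USD, approximate new price
INSTANCES_PER_GPU = 2      # parallel model instances per A100 (assumed)
NUM_GPUS = 10_000          # assumed replication of the serving setup
RENTAL_USD_PER_HOUR = 2.0  # approximate cloud rental rate per A100

print(f"Model instances served: {NUM_GPUS * INSTANCES_PER_GPU:,}")  # 20,000
print(f"Hardware cost (used):  ${NUM_GPUS * A100_PRICE_USED:,}")    # $100,000,000
print(f"Hardware cost (new):   ${NUM_GPUS * A100_PRICE_NEW:,}")     # $200,000,000

# Renting a single A100 around the clock instead of buying one:
hours_per_month = 24 * 30
print(f"One rented A100: ${RENTAL_USD_PER_HOUR * hours_per_month:,.0f}/month")  # $1,440
```

Even at the used price, the implied hardware outlay is on the order of $100M, which is why renting a single GPU by the hour is the practical route for an individual.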

1