Submitted by _underlines_ t3_zstequ in MachineLearning
IWantAGrapeInMyMouth t1_j1b1vl8 wrote
Reply to comment by coolbreeze770 in [D] When chatGPT stops being free: Run SOTA LLM in cloud by _underlines_
I imagine there’ll be open source versions of ChatGPT in the near future given its wild popularity. I’ll probably just use one of those for personal projects, and in a business setting I’d run a dedicated instance of that open source model myself. 0.004 cents per 1,000 tokens (or even much less) is a hell of an ask if you’re doing anything where users are the ones generating tokens.
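For anyone curious what "running a dedicated instance" might look like, here's a minimal sketch using the Hugging Face transformers library. The model name is just a placeholder for whatever open source model you'd actually self-host, not a real ChatGPT equivalent:

```python
# Minimal sketch of self-hosting an open source text-generation model with
# the transformers library. The model ID below is a small placeholder model;
# swap in whichever open source model you actually want to serve.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="bigscience/bloom-560m",  # placeholder model ID
)

prompt = "Explain what a language model is in one sentence."
outputs = generator(prompt, max_new_tokens=50, do_sample=True, temperature=0.7)
print(outputs[0]["generated_text"])
```

Once the model is hosted on your own hardware, the marginal cost per token is basically electricity and depreciation rather than a per-call API fee.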
sanman t1_j1bakvl wrote
Open source is only free when it's running on your own computer. Otherwise, if it's running on someone else's infrastructure, that infrastructure has to be paid for - typically with ads or something like that.
IWantAGrapeInMyMouth t1_j1bzofo wrote
Inference on Hugging Face for large models is usually free for individuals making a reasonable number of API calls as part of their hosted offerings, and I assume an open source version of ChatGPT would end up on there. I realize that it still costs money to run.
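As a rough illustration, calling the Hugging Face hosted Inference API is just an HTTP POST. The model ID and token below are placeholders, and the free-tier limits are whatever Hugging Face currently allows, so check their docs before relying on it:

```python
# Minimal sketch of calling the Hugging Face hosted Inference API.
# The model ID and the API token are placeholders.
import requests

API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom-560m"
HEADERS = {"Authorization": "Bearer hf_your_token_here"}  # placeholder token

def query(prompt: str):
    """Send a prompt to the hosted model and return the parsed JSON response."""
    response = requests.post(API_URL, headers=HEADERS, json={"inputs": prompt})
    response.raise_for_status()
    return response.json()

print(query("Explain what a language model is in one sentence."))
```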