Submitted by fayad-k t3_zfybux in singularity
blueSGL t1_ize8cdk wrote
Everyone is spinning around giddy with ChatGPT
>During the research preview, usage of ChatGPT is free.
I'd honestly not get too attached to relying on this thing when you don't know how much it will cost.
Charges for previous models are no indication of how much this will cost, as it seems to have some sort of memory.
Edit: Remember DALL-E 2, where generations went from free to 'taking the piss', and it wasn't until several competitors came on the scene that they changed the pricing.
stupsnon t1_izea4nw wrote
Everyone and their dog is making large models. Well, everyone who can build and train multimillion dollar models which is like 6 companies. The point is, it’s going to be intense competition. People can smell the future money, and that means it’s going to be very cheap or free for a long while as the dominant player emerges. As this is a platform play, there probably isn’t room for more than a couple winners in this market.
manOnPavementWaving t1_izea9bo wrote
There are easy and cheap ways to give models extra memory
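One cheap approach alluded to here is simply replaying the conversation so far as context on every turn. A minimal sketch of that idea, with a placeholder `fake_llm` standing in for any real text-completion model (this is an illustration, not how ChatGPT actually works internally):

```python
# Hypothetical sketch: a stateless LLM can be given "memory" by prepending
# the running transcript to each new prompt. `fake_llm` is a placeholder,
# not a real API call.

def fake_llm(prompt: str) -> str:
    # Placeholder model: just reports how much context it received.
    return f"(reply after seeing {len(prompt)} chars of context)"

class ChatSession:
    """Keeps a transcript and replays it on each turn, so the model
    appears to remember earlier messages."""

    def __init__(self, max_chars: int = 4000):
        self.history: list[str] = []
        self.max_chars = max_chars  # crude stand-in for a context-window limit

    def ask(self, user_msg: str) -> str:
        self.history.append(f"User: {user_msg}")
        transcript = "\n".join(self.history)
        # Drop the oldest turns if the transcript outgrows the window.
        while len(transcript) > self.max_chars and len(self.history) > 1:
            self.history.pop(0)
            transcript = "\n".join(self.history)
        reply = fake_llm(transcript + "\nAssistant:")
        self.history.append(f"Assistant: {reply}")
        return reply

chat = ChatSession()
chat.ask("My name is Alice.")
print(chat.ask("What's my name?"))  # second turn sees the first in its context
```

The catch, pricing-wise, is that every turn resends the whole history, so token usage grows with conversation length.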
GlobusGlobus t1_izenmke wrote
Realistically it will be paid in tiers. I need ChatGPT now, I want to pay. Right now it is too slow. I want to pay for a fast version.
big_cedric t1_izezrqy wrote
Computational efficiency is going to improve, so eventually it won't be that expensive. Oversized models also carry risks of overfitting and parroting their training data.