
TheTomatoBoy9 t1_j29s315 wrote

The expectation that it will be free applies only to the current version. Subsequent versions can easily be marketed as premium and sold through subscriptions.

And that doesn't even address the business market, where expensive licenses can be sold.

Finally, they are bankrolled by Microsoft, among others. Eye-watering costs are only eye-watering to small startups; they're not much of a problem when the company backing you is sitting on $110 billion in cash.

In the tech world, you can lose money for years if you can sell a good growth story. Especially with backers like Microsoft.

3

visarga t1_j2bi28f wrote

I expect that within the next 12 months we'll have an open model that can rival ChatGPT and runs on more accessible hardware, like 2-4 GPUs. There's a lot of room to optimise inference cost; Flan-T5 is a step in that direction.
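For illustration, here's a minimal sketch of running an open instruction-tuned model like Flan-T5 sharded across whatever GPUs are available, using Hugging Face transformers. The checkpoint name and half-precision setting are assumptions for the example, not a specific recipe:

```python
# Minimal sketch: load an open instruction-tuned model (Flan-T5 XXL assumed as
# an example) sharded across available GPUs with Hugging Face transformers.
# Requires `transformers` and `accelerate`; memory needs depend on model size.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/flan-t5-xxl"  # assumed checkpoint; smaller sizes also exist

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(
    model_name,
    device_map="auto",          # spread layers across the 2-4 GPUs available
    torch_dtype=torch.float16,  # half precision to cut memory and inference cost
)

inputs = tokenizer("Explain why the sky is blue.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```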

I think the community trend is toward small, efficient models that rival the original but run privately on local hardware. For now, the efficient versions are only about 50% as good as GPT-3 and ChatGPT.

2