
Suolucidir t1_jec3awc wrote

I am not sure about AI becoming self-aware, but I see a lot of anxiety in the community about billionaires being the only people in control of these models, and I want to address that issue a little bit.

The fact is that GPT-4 is amazing and not open source, so it is true that you cannot run it yourself. However, it is not inaccessible: you can use it for free, or pay as you go to run it on upgraded hardware with more memory. So it is certainly accessible to regular people.

With that said, GPT-4 is not the only game in town. For example, BLOOM is an open-source alternative that is routinely viewed as comparable to GPT-3.5 (and better in some cases, depending on what you are asking for). There are a few other open-source models that get very close to GPT-3 performance too, like EleutherAI's GPT-NeoX-20B.

Anyway, BLOOM is free for anybody to download, use, and even modify. You might be thinking "Yeah, well how am I supposed to afford to run a model with 176 billion parameters?"

And that is a reasonable thought. The answer is that you probably cannot afford to run it yourself. Here is an example of the hardware you would need to buy: https://shop.lambdalabs.com/deep-learning/servers/blade/customize (at 8x A100 GPUs it's just over $150,000). However, 10 people could go in together at $15,000 apiece, and then it's cheaper than any new car (and it's likely you would never run into each other; huge university departments share this kind of hardware effectively).
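If you're wondering why it takes 8x A100s in the first place, here's a rough back-of-envelope check. This is just a sketch assuming fp16 weights (2 bytes per parameter); a real deployment also needs headroom for activations and the KV cache, so treat the weights figure as a lower bound:

```python
# Back-of-envelope memory estimate for running BLOOM-176B inference.
# Assumption: weights stored in fp16 (2 bytes per parameter); this
# ignores activation memory and KV cache, so it's a lower bound.

params = 176e9            # BLOOM's parameter count
bytes_per_param = 2       # fp16
weights_gb = params * bytes_per_param / 1e9

gpus = 8
gb_per_gpu = 80           # A100 80GB variant
total_vram_gb = gpus * gb_per_gpu

print(f"Weights alone: {weights_gb:.0f} GB")   # 352 GB just for weights
print(f"8x A100 80GB:  {total_vram_gb} GB total VRAM")
```

So the weights alone are around 352 GB, which is why a single consumer GPU (or even a single A100) is a non-starter and the 640 GB across eight A100s is roughly the right ballpark.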

Alternatively, this guy did it for $32/hour using Amazon's cloud: https://medium.com/mlearning-ai/bloom-176b-how-to-run-a-real-large-language-model-in-your-own-cloud-e5f6bdfb3bb1
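Whether renting beats buying depends entirely on how many hours you actually run the thing. A quick hypothetical break-even calculation, using the ~$150,000 server price and the $32/hour cloud rate from that write-up (ignoring electricity, maintenance, and resale value):

```python
# Hypothetical break-even point: buying the 8x A100 server vs.
# renting cloud time at the rate from the linked write-up.
# Numbers are approximate; power, cooling, and depreciation are ignored.

server_cost = 150_000     # approx. price of the 8x A100 build
cloud_rate = 32           # dollars per hour on the cloud

breakeven_hours = server_cost / cloud_rate
breakeven_days = breakeven_hours / 24

print(f"Break-even: {breakeven_hours:.0f} hours "
      f"(~{breakeven_days:.0f} days of 24/7 use)")
```

That works out to roughly 4,700 hours of continuous use before buying pays off, which is why renting by the hour is the sane choice for anyone just experimenting.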

Here is a link to the actual model if anybody wants to really do this: https://huggingface.co/bigscience/bloom
