
Clawz114 t1_j56nnoj wrote

>Because GPT-3 was trained on almost all publicly available data

GPT-3 was trained on around 45TB of data, which is only about 10% of the Common Crawl corpus, and Common Crawl makes up roughly 60% of GPT-3's training dataset.
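To make the scale concrete, here's a quick back-of-the-envelope sketch in Python. It just reuses the rough figures above as assumptions (not exact numbers from the GPT-3 paper), so treat the output as an illustration of the proportions rather than a precise accounting:

```python
# Back-of-the-envelope check using the approximate figures from the comment above
gpt3_data_tb = 45                  # data GPT-3 was trained on, ~45 TB (assumed)
share_of_common_crawl = 0.10       # that 45 TB is ~10% of the Common Crawl corpus (assumed)
common_crawl_weight_in_mix = 0.60  # Common Crawl's weight in GPT-3's training mix (assumed)

# Implied total size of the Common Crawl corpus
common_crawl_total_tb = gpt3_data_tb / share_of_common_crawl
print(f"Implied Common Crawl size: ~{common_crawl_total_tb:.0f} TB")        # ~450 TB
print(f"Fraction of Common Crawl actually used: ~{share_of_common_crawl:.0%}")

# Even the single largest source in the training mix (about 60% of it) was
# sampled at roughly a tenth of its full size; nowhere near "almost all
# publicly available data".
```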

>Especially as the global population is shrinking and most people are already connected online so not a lot of new data is made.

The global population is growing, and is expected to keep growing until it peaks at just over the 10 billion mark?

4