Dankbubbles123 t1_j9hbbp1 wrote
Reply to comment by Buck-Nasty in A German AI startup just might have a GPT-4 competitor this year. It is 300 billion parameters model by Dr_Singularity
Ah okay, nvm then. Sorry
Buck-Nasty t1_j9hch2c wrote
The context window is apparently massive, though: more than 10 times the size of GPT-3's. At that scale it could potentially write whole novels.
https://mobile.twitter.com/transitive_bs/status/1628118163874516992?s=46&t=Biiqy66Cy9oPH8c1BL6_JQ
hydraofwar t1_j9hgim5 wrote
A credible researcher commented that ChatGPT can write code, and that GPT-4 could write entire programs.
GPT-5entient t1_j9hk7td wrote
32k tokens would mean approximately 150 kB of text. That is a decent-sized code base! Also, with this much context memory, the known context-saving tricks would work much better, so this could theoretically be used to create code bases of virtually unlimited size (rough sketch of the math below).
This amazes me and also, being a software dev, scares me...
But, as they say, what a time to be alive!
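A rough Python sketch of that back-of-envelope math, assuming about 4 characters per token (a common rule of thumb, not a figure from the thread; real tokenizers vary by content):

    # Back-of-envelope estimate of how much plain text fits in a given
    # context window. The ~4 characters-per-token figure is an assumed
    # heuristic, not an exact value.

    AVG_CHARS_PER_TOKEN = 4  # rough rule of thumb for English text

    def context_size_kb(num_tokens: int, chars_per_token: float = AVG_CHARS_PER_TOKEN) -> float:
        """Approximate plain-text size (in kB) that fits in num_tokens tokens."""
        return num_tokens * chars_per_token / 1024

    if __name__ == "__main__":
        for tokens in (2_048, 4_096, 32_768):
            print(f"{tokens:>6} tokens ~= {context_size_kb(tokens):.0f} kB of text")
        # 32,768 tokens comes out to roughly 128 kB, in the same ballpark as
        # the ~150 kB mentioned above; the exact figure depends on the
        # tokenizer and the kind of text (code tokenizes differently).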
GPT-5entient t1_j9hji5i wrote
Wow, yeah, this looks amazing. My biggest issue with GPT-3 is the relatively small context window. This will open so many new possibilities.