Buck-Nasty t1_j9hb9az wrote
Reply to comment by Dankbubbles123 in A German AI startup just might have a GPT-4 competitor this year. It is 300 billion parameters model by Dr_Singularity
GPT-4's parameter count isn't known yet.
Dankbubbles123 t1_j9hbbp1 wrote
Ah okay, nvm then. Sorry
Buck-Nasty t1_j9hch2c wrote
The context window is apparently massive, though: more than 10 times the size of GPT-3's. It could potentially write whole novels at that scale.
https://mobile.twitter.com/transitive_bs/status/1628118163874516992?s=46&t=Biiqy66Cy9oPH8c1BL6_JQ
hydraofwar t1_j9hgim5 wrote
A credible researcher commented that ChatGPT can write code, and that GPT-4 could write entire programs.
GPT-5entient t1_j9hk7td wrote
32k tokens is roughly 130 kB of text (at ~4 characters per token). That is a decent-sized code base! Also, with this much context memory the known context-saving tricks would work much better, so this could in theory be used to build code bases of virtually unlimited size.
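A minimal sketch of that estimate, assuming the usual ~4 characters per token rule of thumb (the real ratio varies with the tokenizer and with the content, e.g. code vs. prose):

```python
# Rough size of a 32k-token context in plain text.
CHARS_PER_TOKEN = 4        # assumption: common rule of thumb for English text
CONTEXT_TOKENS = 32_000

approx_chars = CONTEXT_TOKENS * CHARS_PER_TOKEN
print(f"~{approx_chars / 1000:.0f} kB of text")   # ~128 kB
```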
This amazes me and (being a software dev) also scares me...
But, as they say, what a time to be alive!
GPT-5entient t1_j9hji5i wrote
Wow, yeah, this looks amazing. My biggest issue with GPT-3 is the relatively small context window. This will open so many new possibilities.
Practical-Mix-4332 t1_j9hg2cr wrote
Is anything about GPT-4 actually known? It seems like just a bunch of rumors, not even a release date.
Midnight-Movie t1_j9hv0t7 wrote
>Is anything about GPT-4 actually known? It seems like just a bunch of rumors, not even a release date.
I work with someone who has beta access to GPT-4. He won't tell me much other than that it's mind-blowing and that software development will never be the same. He confirms the rumors that it can indeed write an entire piece of software.
farcetragedy t1_j9hzfq1 wrote
That’s exciting. Would be amazing if the next one didn’t just make shit up when it doesn’t know the answer
Practical-Mix-4332 t1_j9hxkf3 wrote
Oh great, another rumor.
Midnight-Movie t1_j9hy4uz wrote
Well... You asked if anything was known. I gave you info from a coworker with beta access. My apologies if my info didn't come with a bouquet of roses and a handwritten card.
Practical-Mix-4332 t1_j9i0ctk wrote
I understand you're trying to help, but this being Reddit and all, there's no way we can trust what you are saying or take it as something officially "known". No offense, though.
MysteryInc152 t1_j9j9dvt wrote
32k context window, it seems.
https://mobile.twitter.com/transitive_bs/status/1628118163874516992?s=20
GPT-5entient t1_j9hkupz wrote
There was that very popular but completely unfounded rumor about a 100T param count. It was debunked by Sam Altman himself.
If you think about it for just one second you'd realize that a 100T param model would need at least 200 TB of VRAM just for fp16 weights, i.e. around 2,500 Nvidia A100s...
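Here's that back-of-the-envelope math as a quick sanity check (fp16 weights only; actual training or serving needs far more than this floor, for activations, optimizer state, KV cache, etc.):

```python
# Floor on memory for the rumored 100T-parameter model: raw fp16 weights only.
params = 100e12               # 100 trillion parameters (the debunked rumor)
bytes_per_param = 2           # fp16
a100_vram_bytes = 80e9        # 80 GB NVIDIA A100

weight_bytes = params * bytes_per_param
print(f"{weight_bytes / 1e12:.0f} TB of weights")                      # 200 TB
print(f"{weight_bytes / a100_vram_bytes:.0f} A100s just for weights")  # 2500
```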