Dankbubbles123 t1_j9hb2j2 wrote

Eh, doesn’t GPT-4 have like 1.4 trillion parameters? It would dwarf this by almost 5 times.

Edit: turns out, I was wrong! :D

−20

Buck-Nasty t1_j9hb9az wrote

GPT-4's parameter count isn't known yet

26

Dankbubbles123 t1_j9hbbp1 wrote

Ah okay, nvm then. Sorry

13

Buck-Nasty t1_j9hch2c wrote

The context window is apparently massive though, more than 10 times the size of GPT-3's. It could potentially write whole novels at that scale.

https://mobile.twitter.com/transitive_bs/status/1628118163874516992?s=46&t=Biiqy66Cy9oPH8c1BL6_JQ

17

hydraofwar t1_j9hgim5 wrote

A credible researcher commented that while ChatGPT can write code, GPT-4 could write entire programs.

12

GPT-5entient t1_j9hk7td wrote

32k tokens would mean approximately 150 kB of text. That is a decent-sized code base! Also, with this much context memory, the known context-saving tricks would work much better, so this could theoretically be used to create code bases of virtually unlimited size.
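For a rough sanity check (a minimal sketch, assuming ~4-5 characters per token, which is just the usual rule of thumb for English text with GPT-style BPE tokenizers, not an exact figure):

```python
# Back-of-the-envelope: how much text fits in a 32k-token context window?
CONTEXT_TOKENS = 32_000
CHARS_PER_TOKEN = 4.7  # rough average for English; varies a lot by content

approx_chars = CONTEXT_TOKENS * CHARS_PER_TOKEN
print(f"~{approx_chars / 1024:.0f} kB of text")  # ~147 kB, close to the 150 kB above
```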

This amazes me and (being a software dev) also scares me...

But, as they say, what a time to be alive!

16

GPT-5entient t1_j9hji5i wrote

Wow, yeah, this looks amazing. My biggest issue with GPT-3 is the relatively small context window. This will open so many new possibilities.

7

Practical-Mix-4332 t1_j9hg2cr wrote

Is anything about GPT-4 actually known? It seems like just a bunch of rumors, without even a release date.

2

Midnight-Movie t1_j9hv0t7 wrote

>Is anything about GPT-4 actually known? It seems like just a bunch of rumors, without even a release date.

I work with someone who has beta access to GPT-4. He won't tell me much other than that it's mind-blowing and that software development will never be the same. He confirms the rumors that it can indeed write an entire piece of software.

6

farcetragedy t1_j9hzfq1 wrote

That’s exciting. Would be amazing if the next one didn’t just make shit up when it doesn’t know the answer

3

Practical-Mix-4332 t1_j9hxkf3 wrote

Oh great, another rumor

1

Midnight-Movie t1_j9hy4uz wrote

Well... you asked if anything was known. I gave you info from a coworker with beta access. My apologies if my info didn't come with a bouquet of roses and a handwritten card.

7

Practical-Mix-4332 t1_j9i0ctk wrote

I understand you’re trying to help, but this being Reddit and all, there’s no way we can trust what you’re saying or treat it as officially “known”. No offense though.

6

GPT-5entient t1_j9hkupz wrote

There was that very popular but completely unfounded rumor about a 100T parameter count. It was debunked by Sam Altman himself.

If you think about it for just one second, you'd realize that a 100T-param model would need at least 200 TB of VRAM, or 2,560 Nvidia A100s...
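The arithmetic (a rough sketch, assuming fp16 weights at 2 bytes per parameter and 80 GB per A100, counting the weights alone and ignoring activations and KV cache):

```python
# Why a 100T-parameter model is implausible on today's hardware.
params = 100e12          # 100 trillion parameters
bytes_per_param = 2      # fp16 / bf16 weights

total_bytes = params * bytes_per_param
print(f"{total_bytes / 1e12:.0f} TB of VRAM just for weights")  # 200 TB
print(f"{total_bytes / 80e9:.0f} x 80 GB A100s")                # ~2,500
# (the 2,560 figure above comes from doing the same math in binary GiB/TiB units)
```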

1

bass6c t1_j9hfy9w wrote

ChatGPT is based on GPT-3, a 175-billion-parameter model.

1