DaLameLama t1_j4zhqqj wrote

Does ChatGPT actually get past the token limit? Codex supports ~8000 tokens. You might underestimate how much this is. Has anyone tested the limits?

Unfortunately, OpenAI aren't serious about publishing technical reports anymore.

30
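
For anyone who wants to probe the limit themselves, here's a minimal sketch using OpenAI's tiktoken library. It assumes the `gpt2` encoding is a close enough proxy for the model's actual tokenizer and that the limit really is around 8000 tokens, as claimed above; both are assumptions, not documented facts.

```python
# Minimal sketch: count tokens in a prompt and check it against an assumed limit.
import tiktoken

enc = tiktoken.get_encoding("gpt2")  # assumed proxy for the model's tokenizer

def count_tokens(text: str) -> int:
    """Return the number of tokens the text occupies under this encoding."""
    return len(enc.encode(text))

def fits_in_context(prompt: str, limit: int = 8000) -> bool:
    """Check whether a prompt stays under the assumed ~8000-token limit."""
    return count_tokens(prompt) <= limit

print(count_tokens("Roughly how many tokens is this sentence?"))
```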

andreichiffa t1_j50x4ky wrote

Reported token size is 2048, but they likely do a hard attention mask. In words that's about a quarter less, so roughly 1500 words.

9
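
If they do apply a hard cutoff like that, the effect would look roughly like the sliding-window truncation sketched below: keep only the most recent turns that fit the budget and silently drop everything older. The 2048 figure comes from the comment above; the turn-by-turn structure is an illustrative assumption, not a documented implementation.

```python
# Illustrative sketch of hard context truncation: keep only the most recent
# conversation turns that fit within a fixed token budget (2048 assumed here).
import tiktoken

enc = tiktoken.get_encoding("gpt2")
TOKEN_BUDGET = 2048  # assumed limit from the comment above

def truncate_history(turns: list[str], budget: int = TOKEN_BUDGET) -> list[str]:
    """Walk backwards from the newest turn, keeping turns until the budget is spent."""
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):
        n = len(enc.encode(turn))
        if used + n > budget:
            break  # everything older than this point is effectively invisible to the model
        kept.append(turn)
        used += n
    return list(reversed(kept))
```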

EmmyNoetherRing t1_j510553 wrote

>Unfortunately, OpenAI aren't serious about publishing technical reports anymore.

Do OpenAI folks show up to any of the major research conferences? These days I mostly come into contact with AI when it wanders into the tech policy/governance world, and this seems like the sort of work that would get you invited to an OSTP workshop, but I'm not sure if that's actually happening.

OpenAI's latest not-so-technical report (on their website) has a few folks from Georgetown contributing to it, and since AAAI is in DC in a few weeks I was hoping OpenAI would be around and available for questions in some capacity, in some room at the conference.

5

DaLameLama t1_j519tns wrote

There was an OpenAI party at NeurIPS, but I wasn't there. No clue about AAAI :)

4

EmmyNoetherRing t1_j51cvjh wrote

Yeah, as an uninformed guess it seems like IJCAI or NeurIPS would be a more natural home, but AAAI is actually in DC, which seems helpful for some categories of conversation, if the right people attend.

3

EmmyNoetherRing t1_j50zesm wrote

I've heard a wide variety of folks talk about leaving ChatGPT tabs/sessions open for days or weeks and maintaining context plausibly well throughout.

3
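
One speculative explanation for that behaviour is client-side context management rather than a larger attention window, e.g. periodically compressing older turns into a summary so the visible conversation always fits the budget. A toy sketch of that idea is below; `summarize` is a hypothetical placeholder for a separate model call, and nothing in this thread confirms this is what ChatGPT actually does.

```python
# Toy sketch of rolling summarization: when the history outgrows the budget,
# compress the older half into a short summary and keep recent turns verbatim.
import tiktoken

enc = tiktoken.get_encoding("gpt2")

def summarize(text: str) -> str:
    # Placeholder: in practice this would be another completion request.
    return "Summary of earlier conversation: " + text[:200]

def compact_history(turns: list[str], budget: int = 2048) -> list[str]:
    """Return a history that fits the token budget, summarizing older turns if needed."""
    total = sum(len(enc.encode(t)) for t in turns)
    if total <= budget:
        return turns
    half = len(turns) // 2
    summary = summarize("\n".join(turns[:half]))
    return [summary] + turns[half:]
```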