
BadassGhost t1_j7pov2k wrote

2019 was GPT-2, which rocked the boat. 2020 was GPT-3, which sank the boat. Those were partly responsible for kicking off this whole scaling-up of transformers.

There was also LaMDA in 2021, and I'm sure there were many other big events in that period that I'm forgetting.

5

p3opl3 t1_j7r3693 wrote

That's actually a fair point.. although those models had been developed well before 2019.. release date isn't the same as development or discovery date, right? It's like GPT-4.. that's already existed for well over 2 years now, it's just not "ready" yet.

Stable Diffusion 3 is literally down to microsecond-level response times now.. it's insane.

Honestly.. I think the big breakthroughs aren't going to be in AI.. they're going to be in UI/UX and in how people bootstrap these models to build something actually useful.

0