Lionfyst t1_j8x2xhd wrote

There will be a boom and there will be a bust, but some interesting stuff will come out of it: new players will thrive and some old ones will die.

Generative AI really feels like it fits the generational wave pattern: explosion, boom, shake-out, and transformation of the industry and culture.

  • ~1980 PCs
  • ~1990 the GUI
  • ~2000 the Web
  • ~2010 Smartphones / Social Media
  • ~2020 Generative AI

Edit: Tried to add the '10s


Lionfyst t1_j8i1477 wrote

A recent paper (around Reddit somewhere) demonstrated that LLMs can do all these novel things, like telling stories, writing poems, doing math, or making charts, despite not being explicitly designed for them, because the massive training organically creates all kinds of sub-models in their network that can handle those types of patterns.

ChatGPT is bad at math because its training was insufficient to give it a model that is reliable.

It's not going to be too long before someone feeds an LLM better math training, and/or builds a hybrid that uses some other kind of technique for the math part and hands math questions off to that other engine.
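
To make the hand-off idea concrete, here's a minimal sketch in plain Python (all names are hypothetical, not any real product's API): a crude router sends anything that looks like bare arithmetic to a small deterministic evaluator, and everything else to whatever language model you're calling (stubbed out here as ask_llm).

    import ast
    import operator
    import re

    # Deterministic arithmetic engine: evaluates a plain expression safely
    # by walking the parsed syntax tree instead of calling eval().
    _OPS = {
        ast.Add: operator.add,
        ast.Sub: operator.sub,
        ast.Mult: operator.mul,
        ast.Div: operator.truediv,
        ast.Pow: operator.pow,
        ast.USub: operator.neg,
    }

    def eval_math(expr):
        """Evaluate an arithmetic expression like '12 * (3 + 4)'."""
        def walk(node):
            if isinstance(node, ast.Expression):
                return walk(node.body)
            if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                return node.value
            if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.left), walk(node.right))
            if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.operand))
            raise ValueError("unsupported expression")
        return walk(ast.parse(expr, mode="eval"))

    def looks_like_math(prompt):
        """Crude router: treat prompts made only of digits/operators as math."""
        return bool(re.fullmatch(r"[\d\s+\-*/().^]+", prompt))

    def ask_llm(prompt):
        # Placeholder for whatever model/API you actually call.
        return "(model-generated answer to: " + prompt + ")"

    def answer(prompt):
        if looks_like_math(prompt):
            return str(eval_math(prompt.replace("^", "**")))
        return ask_llm(prompt)

    print(answer("12 * (3 + 4)"))     # -> 84, from the math engine
    print(answer("Tell me a story"))  # -> routed to the language model

Real systems would use a learned router or let the model emit a tool call rather than a regex, but the division of labor is the same: the LLM handles language, a reliable engine handles the math.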
