Ace_Snowlight OP t1_j0tsspm wrote

Not trying to prove anything, just sharing, look at this: https://www.adept.ai/act

I can't wait to get my hands on this! Isn't it cool? ✨

10

GuyWithLag t1_j0ttml7 wrote

Still, to have AGI you need to have working memory; right now for all transformer-based models, the working memory is their input and output. Adding it is... non-trivial.
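The point about working memory can be illustrated with a minimal sketch (all names are hypothetical; `generate` stands in for any text-completion model, and the token limit is made up): the only "memory" a transformer-based agent has is whatever gets re-fed into its prompt, and anything truncated out of the context window is simply forgotten.

```python
# Minimal sketch: a transformer's only "working memory" is its context window.
# `generate` and MAX_CONTEXT_TOKENS are hypothetical stand-ins, not a real API.

MAX_CONTEXT_TOKENS = 2048  # hypothetical context-window limit

def generate(prompt: str) -> str:
    """Stand-in for a transformer completion call."""
    return "(model output)"

def run_turn(history: list[str], user_input: str) -> str:
    """The agent 'remembers' only what is re-fed into the prompt."""
    history.append(f"User: {user_input}")
    prompt = "\n".join(history)
    # Crude truncation: whatever falls outside the window is forgotten,
    # which is why adding persistent working memory is non-trivial.
    tokens = prompt.split()
    if len(tokens) > MAX_CONTEXT_TOKENS:
        prompt = " ".join(tokens[-MAX_CONTEXT_TOKENS:])
    reply = generate(prompt)
    history.append(f"Assistant: {reply}")
    return reply
```

Longer-term memory would have to live outside this loop (e.g. retrieval or weight updates), which is exactly the non-trivial part.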

12

__ingeniare__ t1_j0tvtay wrote

I wouldn't call ACT-1 AGI, but it looks revolutionary nonetheless. If what they show in those videos is legit, it will be a game changer.

8

red75prime t1_j0v9jzt wrote

Given the current state of LLMs, I expect it to fail on 10-30% of requests.

3

Ace_Snowlight OP t1_j0vaeji wrote

Even if you're right, that failure rate will most likely drop quickly, and sooner than expected.

And even then it will still be a huge deal: every failure feeds the next success (at least in this context).

3

red75prime t1_j0vpb7l wrote

They haven't provided any information about their online learning method. If it relies on transformer in-context learning (the simplest way to boost performance), the results won't be especially spectacular or revolutionary. We'll see.
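For what it's worth, the "simplest thing" would look roughly like this sketch: past successful request/action pairs get prepended to the prompt as few-shot examples, so the model improves without any weight updates. All names here are hypothetical; nothing about Adept's actual method is implied.

```python
# Sketch of performance-boosting via in-context learning: earlier successes
# become few-shot examples in the prompt. Hypothetical names throughout;
# this does not describe ACT-1's actual online learning method.

def build_prompt(successes: list[tuple[str, str]], request: str) -> str:
    """Build a few-shot prompt from past (request, action) successes."""
    shots = [f"Request: {req}\nAction: {act}" for req, act in successes]
    shots.append(f"Request: {request}\nAction:")  # model completes the action
    return "\n\n".join(shots)

# Each success enlarges the example pool for the next request.
successes = [("open the inbox", "click(#inbox)")]
prompt = build_prompt(successes, "archive the first email")
```

This is cheap and genuinely helps, but it's bounded by the context window, which is why results from it alone wouldn't be revolutionary.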

3