PrivateUser010 t1_j8qrk0j wrote
I see no problem with this. What could possibly go wrong? lol
PrivateUser010 t1_j8e2gry wrote
Reply to comment by Borrowedshorts in Anthropic's Jack Clark on AI progress by Impressive-Injury-91
Yes. I don't believe AI exhibits this either.
PrivateUser010 t1_j8e0iai wrote
Reply to comment by Villad_rock in Anthropic's Jack Clark on AI progress by Impressive-Injury-91
Google invented Transformers. The architecture came out in 2017, in the paper "Attention Is All You Need". It was the first model to rely entirely on attention mechanisms, with no recurrence or convolution.
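For a sense of what "entirely attention-based" means, here's a minimal NumPy sketch of scaled dot-product attention, the core operation from that paper (my own illustration, not the paper's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal sketch of the attention operation from 'Attention Is All You Need'."""
    d_k = K.shape[-1]
    # Similarity of each query to every key, scaled to keep softmax well-behaved
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors
    return weights @ V

# Toy usage: 3 tokens, 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```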
PrivateUser010 t1_j8dls02 wrote
Compounding an exponential at a constant rate is still just an exponential. But I think Jack Clark may have imagined something doubly exponential, like e^(e^x).
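To spell out the difference (my notation, not Clark's), a quick LaTeX sketch:

```latex
\documentclass{article}
\begin{document}
Compounding exponentials at constant rates is still a single exponential:
\[ e^{ax}\,e^{bx} = e^{(a+b)x}. \]
A doubly exponential function,
\[ f(x) = e^{e^{x}}, \]
eventually dominates every single exponential $e^{kx}$, however large $k$ is.
\end{document}
```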
PrivateUser010 t1_j8dld1f wrote
Reply to comment by genericrich in Anthropic's Jack Clark on AI progress by Impressive-Injury-91
We have to give credit to algorithmic improvements too. Neural network models like Transformers, pretrained Transformers, and Generative Adversarial Networks were all introduced in the 2010-2020 decade, and without those models the current changes would not be possible. So data, yes, processing power, yes, but models too.
PrivateUser010 t1_j9ynpg1 wrote
Reply to comment by kindred_asura in New SOTA LLM called LLaMA releases today by Meta AI 🫡 by Pro_RazE
It's not just the cost of training. It's the availability of quality data. Meta, Google, and Microsoft are all at the forefront of this because of their access to data.