CubeFlipper t1_j9s5v9g wrote
Reply to comment by LettucePrime in Question for any AI enthusiasts about an obvious (?) solution to a difficult LLM problem in society by LettucePrime
>I know that the computing power necessary for the most successful models far outstrips what your average consumer is capable of generating.
And once upon a time, a useful computer would never have fit in an average person's home. Ignoring all the other ways your store-everything idea wouldn't be effective, the cost of compute and the efficiency of these models are changing so fast that by the time your idea was implemented, it would already be obsolete.
LettucePrime OP t1_j9sine9 wrote
Oh no, that seems a bit silly to me. The last 15 years have literally been about building our global "store-everything" infrastructure. If we're betting on a race between web devs encoding tiny text files and computer engineers trying to rescale a language model of unprecedented size onto hardware so efficient that it's more cost-effective to run on-site than to access remotely, I'm putting my money on the web devs lmao