pppoopppdiapeee t1_ivf6xqd wrote
Reply to [D] Do you think there is a competitive future for smaller, locally trained/served models? by naequs
There is ALWAYS a need for smaller models. However, I don’t think “higher quality data” is what will effect that change. The fundamental building blocks we use need to change drastically to allow for more flexibility, richer data representations, and more robust pattern learning. High-quality data is always good, but enabling richer data representations and one-shot/zero-shot learning is better.
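For context, here is a minimal sketch of what zero-shot behavior from a small, locally served model can already look like, assuming the `sentence-transformers` package and the ~22M-parameter `all-MiniLM-L6-v2` model (both are illustrative choices, not anything from the original comment):

```python
# Illustrative sketch: zero-shot text classification with a small
# sentence-embedding model, run locally with no task-specific training.
# Model name and labels below are arbitrary examples for demonstration.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs on CPU

labels = ["sports", "politics", "technology"]
text = "The new GPU architecture doubles inference throughput."

# Embed the candidate labels and the input text into the same vector space.
label_emb = model.encode(labels, convert_to_tensor=True)
text_emb = model.encode(text, convert_to_tensor=True)

# Pick the label whose embedding is closest to the text embedding --
# no labeled training data for this task is needed.
scores = util.cos_sim(text_emb, label_emb)[0]
print(labels[int(scores.argmax())])  # -> "technology"
```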