Naive_Piglet_III t1_ivf7an0 wrote
Reply to comment by naequs in [D] Do you think there is a competitive future for smaller, locally trained/served models? by naequs
This is where I believe the human component of AI/ML will lie in the future: being able to discern use cases where simple models will work versus where complex algorithms will genuinely add value.
If you look at how businesses approach AI/ML today, everyone wants a cloud-based platform integrated with a massive data lake, capable of running deep learning / reinforcement learning algorithms. But the reality is that the majority of business problems (specifically in non-tech businesses like retail, e-commerce, financial services, etc.) don't require anything that complex.
My heart weeps when organisations try to implement a deep learning model for a simple fraud-detection use case that could well be solved by a logistic regression model trained on a much smaller amount of data. What's worse, they'll probably spend millions of dollars trying to develop and operationalise the solution.
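To make that concrete, here's a minimal sketch of the kind of simple baseline I mean, assuming a tabular transaction dataset with a binary fraud label. The file name and feature columns are hypothetical placeholders, not taken from any particular project:

```python
# Hypothetical example: logistic regression baseline for fraud detection
# on a tabular dataset. Column names and the CSV path are made up.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

df = pd.read_csv("transactions.csv")  # hypothetical file
features = ["amount", "hour_of_day", "merchant_risk_score", "txn_count_24h"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["is_fraud"],
    test_size=0.2, stratify=df["is_fraud"], random_state=42,
)

# class_weight="balanced" compensates for fraud labels usually being rare.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(class_weight="balanced", max_iter=1000),
)
model.fit(X_train, y_train)

# Evaluate with ROC AUC, which is robust to the class imbalance.
probs = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```

If a baseline like this already hits an acceptable ROC AUC, the marginal value of a deep learning pipeline is often very hard to justify.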
The problem, however, is that hype merchants (read: consulting companies) make it sound like this is the only way companies can stay competitive in the future. AI/ML conferences don't help either, in that they almost always want to showcase an insanely complicated algorithm built on a massive tech stack. I feel there are very few people in the industry who advocate for simplification.
But eventually, I expect the hype to die down and companies to realise that this approach doesn't give them an incremental benefit in every use case.
Having said all that, for the specific examples you've given, like language and image processing, I do expect large / deep models to become the norm, because these models are also offered as a service (like GitHub Copilot), and it might actually be cheaper to use them directly than to develop a small-scale customised model.
Naive_Piglet_III t1_ivevmim wrote
Reply to [D] Do you think there is a competitive future for smaller, locally trained/served models? by naequs
Could you provide a bit more context? Are you referring specifically to language processing and similar use cases, or to general ML use cases?
Naive_Piglet_III t1_j32tqpo wrote
Reply to comment by masklinn in Where does all of the snot come from when you have a cold? by Dunkachin0
I remember talking to a medical student who explained that it also acts as a first line of defence. It can bind potential pathogens in the nasal passages, where they are either killed by specific immune cells or expelled by coughing / sneezing.
Some viruses (most commonly those behind the common cold) have also evolved to survive this, which is why you can catch a cold when someone sneezes on you.