ReginaldIII t1_j5zvqal wrote

Okay, you're picky :p

Try deploying a model for realtime online learning of streaming sensor data that needs to run on battery power, and then insist it has to run on GPUs.

Plenty of legitimate use cases for non-GPU ML.
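For what it's worth, the pattern being described is roughly this: one cheap update per incoming sample, no batching, no accelerator. A minimal sketch (the `OnlineLinearRegressor` class and the simulated stream are hypothetical illustrations, not any specific deployed system) of CPU-only online learning:

```python
import numpy as np

class OnlineLinearRegressor:
    """Tiny CPU-only online learner: one SGD step per incoming sample."""

    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return self.w @ x + self.b

    def partial_fit(self, x, y):
        # Single SGD step on squared error -- constant memory,
        # a handful of multiply-adds per sample, no GPU needed.
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

# Simulated sensor stream with true relationship y = 2*x0 - x1 + noise.
rng = np.random.default_rng(0)
model = OnlineLinearRegressor(n_features=2)
for _ in range(2000):
    x = rng.normal(size=2)
    y = 2.0 * x[0] - 1.0 * x[1] + rng.normal(scale=0.01)
    model.partial_fit(x, y)

print(model.w)  # weights approach [2, -1] as samples stream in
```

The per-sample cost here is a few floating-point operations, which is exactly the regime where an embedded CPU or microcontroller beats waking up a power-hungry accelerator.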

7

ML4Bratwurst t1_j5zxikl wrote

Can you give me one example of this? And even if you can, my point still stands, because I did not say you should delete CPU support lol

−1

ReginaldIII t1_j5zzhj1 wrote

Pick the tools that work for the problems you have. If you are doing online training of a model on an embedded device, you need something optimized for that hardware.

I gave you a generic example of a problem domain where this applies. You can search for online training on embedded devices if you are interested but I can't talk about specific applications because they are not public.

All I'm saying is that drawing a line in the sand and saying you'd never use X if it doesn't have Y is silly, because what if you end up working on something in the future where the constraints are different?

5