Submitted by Realistic-Bed2658 t3_zzhg84 in MachineLearning
Realistic-Bed2658 OP t1_j2by65z wrote
Reply to comment by miss_minutes in [D] GPU-enabled scikit-learn by Realistic-Bed2658
Thanks for the links, but I disagree for the most part.
DBSCAN and LOF would most likely benefit. Even scikit-learn's own MLP model would inherently benefit from it (though I do believe somebody willing to train a neural network would most likely use PyTorch or TF instead).
Also, the fact that non-DL ML is mainly CPU-based today doesn't mean this won't change five years from now. Personal opinion here, though.
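A minimal sketch of that last point, assuming PyTorch (the data shape, model size, and hyperparameters below are made up for illustration): moving a small MLP onto the GPU takes only a few lines, which is part of why neural-network users rarely reach for scikit-learn's CPU-only MLPClassifier.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Synthetic data: 10,000 samples, 20 features, 3 classes (illustrative only)
X = torch.randn(10_000, 20, device=device)
y = torch.randint(0, 3, (10_000,), device=device)

# Roughly the role of sklearn.neural_network.MLPClassifier(hidden_layer_sizes=(64,)),
# but trainable on the GPU
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(50):  # full-batch training, just to keep the sketch short
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
```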
AerysSk t1_j2cgmo4 wrote
If you are looking for a GPU version of scikit-learn, I think Nvidia is making one; they call it cuml. Note that not all algorithms are implemented, and some functions are missing as well.
However, a note about Apple and AMD GPUs: they are on the rise, but it will be a few years before they become usable for this. My lab has only Nvidia GPUs, and we already have plenty of headaches dealing with Nvidia drivers and libraries. For at least a few years, we have no plans to switch to AMD or Apple.
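For reference, a minimal sketch of how cuML mirrors the scikit-learn API, using DBSCAN as an example (this assumes an NVIDIA GPU with the RAPIDS cuml package installed; the data below is synthetic):

```python
import numpy as np
from cuml.cluster import DBSCAN  # GPU counterpart of sklearn.cluster.DBSCAN

# Synthetic data, illustrative only
X = np.random.rand(100_000, 16).astype(np.float32)

# Same estimator pattern as scikit-learn, but the computation runs on the GPU
labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(X)
print(labels[:10])
```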
JocialSusticeWarrior t1_j2ed75n wrote
unfortunate name "cuml"
Realistic-Bed2658 OP t1_j2cld4m wrote
Totally understandable. I only use Nvidia at work too.
Thanks for the info about the Nvidia package!
AmbitiousTour t1_j2c637s wrote
Gradient boosting hugely benefits from a GPU.
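As a rough illustration (assuming XGBoost; LightGBM and CatBoost have similar switches, and the data here is synthetic), enabling the GPU is usually a one-parameter change:

```python
import numpy as np
from xgboost import XGBClassifier

# Synthetic data, illustrative only
X = np.random.rand(200_000, 50).astype(np.float32)
y = np.random.randint(0, 2, size=200_000)

# "gpu_hist" works on XGBoost 1.x; on 2.x the equivalent is
# tree_method="hist", device="cuda"
model = XGBClassifier(n_estimators=200, tree_method="gpu_hist")
model.fit(X, y)
```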