Submitted by Realistic-Bed2658 t3_zzhg84 in MachineLearning
Hi everyone. I was looking for a GPU-enabled version of scikit-learn, but I was surprised to find that most such libraries target NVIDIA GPUs only. While NVIDIA GPUs are very common, I feel that the availability of Apple's GPUs and the popularity of AMD hardware demand broader coverage.
I was wondering whether reimplementing scikit-learn on top of PyTorch, with GPU support, is something the community would be interested in.
I do not work for Meta. I would just like to do something useful for the community.
Cheers.
miss_minutes t1_j2bph94 wrote
https://stackoverflow.com/a/41568439
https://scikit-learn.org/stable/faq.html#will-you-add-gpu-support
Scikit-learn isn't meant to be used with the GPU. Machine learning that isn't deep learning doesn't benefit as much from GPU compute. Scikit-learn also uses optimised native libraries where possible (e.g. libsvm and liblinear), so if you were to implement the sklearn API on top of PyTorch, it's very unlikely you'd beat scikit-learn's performance.
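For what it's worth, here's a rough sketch of what wrapping the sklearn-style fit/predict API around PyTorch might look like. The `TorchRidge` class and its closed-form solver are my own illustration, not anything from scikit-learn or PyTorch; the `device` argument is where you'd pass `"cuda"` or `"mps"` to move the linear algebra onto a GPU:

```python
# Hypothetical sketch of an sklearn-style estimator backed by PyTorch.
# Not an actual scikit-learn class; just illustrates the API shape.
import torch


class TorchRidge:
    """Ridge regression with an sklearn-like fit/predict interface,
    solved in closed form on the chosen device."""

    def __init__(self, alpha=1.0, device="cpu"):
        self.alpha = alpha
        self.device = device  # "cuda", "mps", or "cpu"

    def fit(self, X, y):
        X = torch.as_tensor(X, dtype=torch.float64, device=self.device)
        y = torch.as_tensor(y, dtype=torch.float64, device=self.device)
        n_features = X.shape[1]
        # Closed-form ridge solution: (X^T X + alpha * I)^-1 X^T y
        A = X.T @ X + self.alpha * torch.eye(
            n_features, dtype=torch.float64, device=self.device
        )
        self.coef_ = torch.linalg.solve(A, X.T @ y)
        return self

    def predict(self, X):
        X = torch.as_tensor(X, dtype=torch.float64, device=self.device)
        return X @ self.coef_
```

Even then, for a small dense solve like this the GPU round-trip usually costs more than it saves, which is the commenter's point: scikit-learn's native backends are already very hard to beat outside of deep learning workloads.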