Submitted by Realistic-Bed2658 t3_zzhg84 in MachineLearning
jeanfeydy t1_j2cnd36 wrote
Reply to comment by miss_minutes in [D] GPU-enabled scikit-learn by Realistic-Bed2658
From direct discussions with the sklearn team, note that this may change relatively soon: a GPU engineer funded by Intel was recently added to the core development team. Last time I met the team in person (6 months ago), the plan was to factor some of the most GPU-friendly computations out of the sklearn code base, such as K-Nearest Neighbor search or kernel-related computations, and to document an internal API that lets external developers easily build accelerated backends. As shown by e.g. our KeOps library, GPUs are extremely well suited to classical ML, and sklearn is the perfect platform to let users take full advantage of their hardware. Let's hope that OP's question will become moot at some point in 2023-24 :-)
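For context, here is a minimal sketch of the kind of K-Nearest Neighbor search that maps well onto GPUs, written with the pykeops.torch LazyTensor API; the sizes, shapes and variable names are illustrative and have nothing to do with sklearn's internals.

```python
import torch
from pykeops.torch import LazyTensor

# Illustrative data: 100k reference points and 10k queries in dimension 3.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(100_000, 3, device=device)  # reference points, shape (N, D)
y = torch.randn(10_000, 3, device=device)   # query points, shape (M, D)

# Symbolic (lazy) tensors: the (M, N) distance matrix is never materialized.
Y_i = LazyTensor(y[:, None, :])  # (M, 1, D)
X_j = LazyTensor(x[None, :, :])  # (1, N, D)

D_ij = ((Y_i - X_j) ** 2).sum(-1)    # (M, N) symbolic squared distances
indices = D_ij.argKmin(K=10, dim=1)  # (M, 10) indices of the 10 nearest references
```

The same map-reduce pattern covers kernel-related computations (e.g. Gaussian kernel sums), which is why these routines are natural first candidates for an accelerated backend.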