Repulsive_Tart3669 t1_j4qqivs wrote
Reply to comment by BenoitParis in [D] Is it possible to update random forest parameters with new data instead of retraining on all data? by monkeysingmonkeynew
This should be the first thing to consider. For instance, gradient-boosted trees are mostly implemented in C/C++ and have GPU compute backends: XGBoost, CatBoost, and LightGBM. With daily updates, you'll have enough time not only to train a model but also to optimize its hyperparameters. In my experience, XGBoost + Ray Tune works just fine.
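For anyone curious, here's a rough sketch of what that kind of daily retrain-and-tune loop can look like. This is not my exact setup: the data loader, search space, and metric choice are all placeholders, and the exact Ray Tune reporting API varies a bit between versions.

```python
# Sketch: retrain an XGBoost model on today's data and tune a few
# hyperparameters with Ray Tune. Assumes Ray 2.x and the xgboost package.
import xgboost as xgb
from ray import tune
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split


def load_todays_data():
    # Placeholder: substitute your own daily data snapshot here.
    X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
    return train_test_split(X, y, test_size=0.2, random_state=0)


def train_xgb(config):
    X_train, X_valid, y_train, y_valid = load_todays_data()
    dtrain = xgb.DMatrix(X_train, label=y_train)
    dvalid = xgb.DMatrix(X_valid, label=y_valid)
    evals_result = {}
    xgb.train(
        {**config, "objective": "binary:logistic", "eval_metric": "logloss"},
        dtrain,
        num_boost_round=200,
        evals=[(dvalid, "valid")],
        evals_result=evals_result,
        verbose_eval=False,
    )
    # Returning a dict reports the final validation loss to Ray Tune.
    return {"logloss": evals_result["valid"]["logloss"][-1]}


search_space = {
    "max_depth": tune.randint(3, 10),
    "eta": tune.loguniform(1e-3, 3e-1),
    "subsample": tune.uniform(0.5, 1.0),
}

tuner = tune.Tuner(
    train_xgb,
    param_space=search_space,
    tune_config=tune.TuneConfig(metric="logloss", mode="min", num_samples=20),
)
results = tuner.fit()
print(results.get_best_result().config)
```

Each trial here retrains from scratch on the latest snapshot, which is the whole point: with GPU-backed boosting libraries a full retrain plus a modest search like this usually fits comfortably inside a daily window.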