neato5000
neato5000 t1_ir0rkr8 wrote
Reply to [D] How do you go about hyperparameter tuning when network takes a long time to train? by twocupv60
You do not need to train to completion to discard hyperparameter settings that will not perform well. In general, early relative performance is a good predictor of final performance, so if a certain hp vector is performing worse than its peers in the early stages of training, kill it and start training with the next hp vector.
This is roughly the logic behind population-based training.
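A minimal sketch of that kill-the-losers loop, in the style of successive halving: train every config for a small budget, drop the worse half, and repeat. The `train_step` toy objective and the `lr` hyperparameter are hypothetical stand-ins, not anything from a real training run.

```python
import random

def train_step(hp, state):
    # Toy stand-in for one epoch of training: loss depends on a
    # hypothetical learning-rate hyperparameter and shrinks over time.
    state["epochs"] += 1
    noise = random.uniform(0.9, 1.1)
    return abs(hp["lr"] - 0.1) * noise / state["epochs"]  # lower is better

def successive_halving(configs, rounds=3, budget_per_round=2):
    """Train all configs a little, keep the best half, repeat."""
    population = [(hp, {"epochs": 0}) for hp in configs]
    while len(population) > 1 and rounds > 0:
        scored = []
        for hp, state in population:
            loss = None
            for _ in range(budget_per_round):
                loss = train_step(hp, state)
            scored.append((loss, hp, state))
        scored.sort(key=lambda t: t[0])  # best (lowest loss) first
        keep = max(1, len(scored) // 2)
        population = [(hp, state) for _, hp, state in scored[:keep]]
        rounds -= 1
    return population[0][0]  # surviving hp vector

random.seed(0)
configs = [{"lr": lr} for lr in (0.001, 0.01, 0.1, 0.3, 1.0)]
best = successive_halving(configs)
print(best)
```

Population-based training goes a step further than this sketch: instead of just killing the losers, it copies the weights and hyperparameters of the winners into the freed slots and perturbs them.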
neato5000 t1_jccszzz wrote
Reply to [D] To those of you who quit machine learning, what do you do now? by nopainnogain5
I've had jobs similar to what you describe. My current job involves fewer tiny tweaks to massive DL models and more feature engineering, and engineering in general, which suits me better.
My slightly warm take is that DL at the coal face in industry feels very random and very time-consuming, and as a result a bit demoralising. More power to you if you have the knack for it and enjoy it; it's just not super my bag.