pwsiegel t1_jcbjf82 wrote
Reply to comment by SnooPears7079 in [D] To those of you who quit machine learning, what do you do now? by nopainnogain5
It's a property of the field in general - there is very little theory to guide neural architecture design, just some heuristics backed by trial-and-error experimentation. Deep learning models are fun, but in practice you spend a lot of your time trying to trick gradient descent into converging faster.
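To make that last point concrete, here is a minimal sketch (assuming PyTorch and a toy classifier, neither of which is from the original comment) of the kind of convergence heuristics being alluded to: normalization layers, gradient clipping, and a warmup-plus-cosine learning-rate schedule. None of these come from theory; they are tricks people use because they empirically help the optimizer converge.

```python
import torch
import torch.nn as nn

# Toy model (hypothetical, for illustration only): LayerNorm is one of the
# standard heuristics for stabilizing activations and easing optimization.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.LayerNorm(256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4, weight_decay=0.01)

# Warmup followed by cosine decay: a common rule-of-thumb schedule,
# chosen by trial and error rather than derived from any theory.
warmup = torch.optim.lr_scheduler.LinearLR(optimizer, start_factor=0.1, total_iters=100)
cosine = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=1000)
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, cosine], milestones=[100]
)

for step in range(1100):
    x = torch.randn(32, 128)          # dummy batch
    y = torch.randint(0, 10, (32,))   # dummy labels
    loss = nn.functional.cross_entropy(model(x), y)

    optimizer.zero_grad()
    loss.backward()
    # Gradient clipping: another heuristic to keep updates well-behaved.
    nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    scheduler.step()
```

The specific values (warmup length, clipping norm, learning rate) are illustrative defaults, not recommendations; in practice each one tends to be tuned by exactly the kind of trial-and-error experimentation the comment describes.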