Submitted by AutoModerator t3_110j0cp in MachineLearning
FrostedFlake212 t1_ja04vmw wrote
Reply to comment by [deleted] in [D] Simple Questions Thread by AutoModerator
Oh wow okay, that makes a lot of sense! So essentially “converging” means, in simpler terms, that the model comes to a conclusion. And what you’re saying is that the model settles on a conclusion too quickly: the solution it finds is good, but not the optimal one.
[deleted] t1_ja083mb wrote
Yes, the model "thinks" the solution it found is the best one, but it is not. Along the way, the optimization gets stuck in the complicated shape of the loss it is minimizing (for example, a local minimum), so it never reaches the truly optimal solution, hence "non-optimal solution".
Sometimes it gets even worse: not only does it fail to converge to the best solution (previous paragraph), it actually diverges, i.e. the error increases (the loss grows) instead of decreasing. This is less common and usually just means something is off in the training setup, for example a learning rate that is too large.
This is just a broad idea.
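In case a concrete picture helps, here is a tiny sketch (mine, not from the comment above) of the two situations, using plain gradient descent on a made-up 1-D loss. The function, learning rates and starting points are all just illustrative:

```python
# Toy sketch (not from the thread): plain gradient descent on a made-up 1-D loss
# with two minima, to show "converged but non-optimal", "converged to the best
# solution", and "diverged" side by side. Everything here is hypothetical.

def loss(w):
    # Non-convex toy loss: global minimum near w ~ -1.30, local minimum near w ~ 1.13
    return w**4 - 3 * w**2 + w

def grad(w):
    # Derivative of the loss above
    return 4 * w**3 - 6 * w + 1

def train(w, lr, steps=100):
    for _ in range(steps):
        w = w - lr * grad(w)
        if abs(w) > 1e6:  # stop early once the run has clearly blown up
            break
    return w, loss(w)

# Same procedure, same learning rate, different starting points:
w_a, loss_a = train(w=-2.0, lr=0.01)   # ends near w ~ -1.30, the best (global) minimum
w_b, loss_b = train(w=2.0, lr=0.01)    # "converges" near w ~ 1.13, a good but non-optimal minimum

# Learning rate far too large: each update overshoots and the error keeps growing.
w_c, loss_c = train(w=2.0, lr=0.5)     # diverges, loss ends up astronomically large

print(f"global minimum:      w={w_a:.2f}, loss={loss_a:.2f}")
print(f"non-optimal minimum: w={w_b:.2f}, loss={loss_b:.2f}")
print(f"diverged:            w={w_c:.2e}, loss={loss_c:.2e}")
```

Depending on where it starts, the same procedure either finds the best minimum or settles into a nearby, worse one, and with a learning rate that is far too big the loss just keeps growing instead of shrinking.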