Submitted by AutoModerator t3_110j0cp in MachineLearning
FrostedFlake212 t1_j9xc55e wrote
What is meant by this statement: "GM (Gaussian mixture) on its own is not of much use because it converges too fast to a non-optimal solution"?
[deleted] t1_j9zrouh wrote
I don't know what GM is, but for the second part: you can imagine that training a model is like searching for an optimal set of parameters. However, some models find good parameters but not the best ones; this usually depends on the loss function and other characteristics of the model. A sketch of the idea is below.
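A minimal sketch of that idea, assuming plain gradient descent on a made-up 1-D loss (the function and the numbers are purely illustrative, nothing GM-specific): where you start decides which minimum you end up in, and only one of them is the best.

```python
# Toy loss with two minima: a shallow local one near x ≈ +0.93
# and the global one near x ≈ -1.05.
def loss(x):
    return x**4 - 2 * x**2 + 0.5 * x

def grad(x):
    return 4 * x**3 - 4 * x + 0.5

def gradient_descent(x0, lr=0.01, steps=500):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Two different starting points settle in two different minima;
# both look "converged", but only one is the best solution.
for x0 in (2.0, -2.0):
    x = gradient_descent(x0)
    print(f"start={x0:+.1f} -> x={x:+.3f}, loss={loss(x):+.3f}")
```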
FrostedFlake212 t1_ja04vmw wrote
Oh wow okay, that makes a lot of sense! So essentially "converging" means, in simpler terms, that the model settles on an answer. And what you're saying is that the model settles too quickly on a set of parameters that are good, but not the optimal ones.
[deleted] t1_ja083mb wrote
Yes, the model "thinks" the solution it found is the best one, but it is not. The optimization gets stuck in a local optimum of the loss surface along the way and never reaches the global optimum, hence "non-optimal solution".
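To make that concrete for Gaussian mixtures, here is a sketch using scikit-learn's GaussianMixture (fitted with EM). With purely random initialization (`init_params="random"`), different seeds can settle in different local optima of the log-likelihood; the data and parameter choices here are just for illustration, and exactly which seeds get stuck will vary.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Three well-separated 1-D clusters.
X = np.concatenate([rng.normal(-5, 0.5, 300),
                    rng.normal(0, 0.5, 300),
                    rng.normal(5, 0.5, 300)]).reshape(-1, 1)

# EM converges quickly, but where it converges depends on the
# (here purely random) initialization: different seeds can land
# in different local optima of the log-likelihood.
for seed in range(5):
    gm = GaussianMixture(n_components=3, init_params="random",
                         n_init=1, random_state=seed).fit(X)
    print(f"seed={seed}  avg log-likelihood={gm.score(X):.3f}  "
          f"means={np.sort(gm.means_.ravel()).round(2)}")

# The usual workaround: several random restarts, keep the best run.
best = GaussianMixture(n_components=3, init_params="random",
                       n_init=10, random_state=0).fit(X)
print(f"n_init=10  avg log-likelihood={best.score(X):.3f}")
```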
Sometimes it goes even worse: not only does the model fail to converge to the best solution (previous paragraph), it actually diverges, i.e. the error increases instead of decreasing. This is less common and usually points to a problem with the training setup, such as a learning rate that is too high.
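A tiny illustration of divergence, assuming the setup error is an overly large learning rate: on the loss f(w) = w², any step size above 1.0 makes each gradient update overshoot the minimum, so the error grows at every step.

```python
# Gradient descent on f(w) = w**2 (gradient: 2*w).
# With lr > 1.0 the update w -= lr * 2 * w flips the sign of w
# and grows |w|, so the loss increases instead of decreasing.
w, lr = 1.0, 1.1
for step in range(6):
    print(f"step {step}: w={w:+.3f}, loss={w * w:.3f}")
    w -= lr * (2 * w)
```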
This is just a broad idea.