Submitted by thanderrine t3_zc0kco in MachineLearning
idkname999 t1_iyuilcz wrote
Validation loss flattening doesn't necessarily mean overfitting. It only indicates overfitting when the validation loss starts to increase. Yes, I have experienced instances where early stopping (or an early drop of the learning rate) led to a less optimal solution. That said, basically every solution will be a local minimum, so you shouldn't worry about that too much.
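To make the distinction concrete, here is a minimal sketch of patience-based early stopping (all names and values are illustrative, not from the original post): it triggers only after the validation loss fails to improve for several consecutive epochs, so a merely flat curve does not stop training.

```python
# Minimal sketch: stop when validation loss stops improving for
# `patience` consecutive epochs, not when it merely flattens.

class EarlyStopping:
    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best_loss = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Return True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss  # improvement: reset the counter
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1       # flat or rising loss
        return self.bad_epochs >= self.patience


# Toy usage: a validation curve that flattens, then rises.
losses = [1.0, 0.6, 0.45, 0.44, 0.44, 0.43, 0.46, 0.50, 0.55, 0.60]
stopper = EarlyStopping(patience=3)
for epoch, loss in enumerate(losses):
    if stopper.step(loss):
        print(f"Stopping at epoch {epoch}, best val loss {stopper.best_loss:.2f}")
        break
```

With `patience=3`, the flat stretch around epochs 3-5 does not trigger a stop (one bad epoch is reset by the later improvement); only the sustained rise from epoch 6 onward does.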