
idkname999 t1_iyuilcz wrote

Validation loss flattening doesn't necessarily mean overfitting. It is only overfitting when the validation loss starts to increase. Yes, I've experienced instances where early stopping (or an early drop of the learning rate) led to a less optimal solution. That said, basically every solution will be a local minimum, so you shouldn't worry about that as much.
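To make the distinction concrete, here is a minimal sketch (hypothetical class and parameter names, not from any particular library) of an early-stopping check that fires only when validation loss actually rises past the best value seen, so a flat loss curve alone never triggers a stop:

```python
class EarlyStopping:
    """Stop only on a sustained *increase* in validation loss, not a plateau."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs of rising loss to tolerate
        self.min_delta = min_delta    # increase needed to count as "worse"
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best:
            self.best = val_loss      # new best: reset the counter
            self.bad_epochs = 0
        elif val_loss > self.best + self.min_delta:
            self.bad_epochs += 1      # loss rose above the best: count it
        # a flat loss (within min_delta of the best) counts as neither
        return self.bad_epochs >= self.patience
```

With this criterion, a run whose validation loss merely plateaus keeps training, while one whose loss climbs for `patience` consecutive epochs stops.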
