Submitted by Imaginary_Carrot4092 t3_xudng9 in MachineLearning
I am trying to fit a neural network to a certain dataset, and the loss decreases very steeply in the first 2 epochs and then remains almost constant. I have tried manipulating the hyperparameters, but the loss pattern never changes. My dataset is quite large, so there is no scarcity of data.
How do I identify the problem here, and how can I conclude that my data cannot be learned at all?
PassionatePossum t1_iqv5fo9 wrote
Are we talking about training loss or validation loss? Because the training loss will almost always go down, and on its own it means very little.
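
A minimal sketch of what tracking both looks like, assuming a PyTorch regression setup with synthetic placeholder data (the model, array sizes, and hyperparameters are illustrative, not the OP's pipeline):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data: 10k samples, 20 features, noisy linear target.
X = torch.randn(10_000, 20)
y = X @ torch.randn(20, 1) + 0.1 * torch.randn(10_000, 1)

train_ds = TensorDataset(X[:8_000], y[:8_000])
val_ds = TensorDataset(X[8_000:], y[8_000:])
train_dl = DataLoader(train_ds, batch_size=64, shuffle=True)
val_dl = DataLoader(val_ds, batch_size=256)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    # Training pass: accumulate the average training loss.
    model.train()
    train_loss = 0.0
    for xb, yb in train_dl:
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()
        train_loss += loss.item() * len(xb)
    train_loss /= len(train_ds)

    # Validation pass: no gradients, just measure the loss on held-out data.
    model.eval()
    val_loss = 0.0
    with torch.no_grad():
        for xb, yb in val_dl:
            val_loss += loss_fn(model(xb), yb).item() * len(xb)
    val_loss /= len(val_ds)

    print(f"epoch {epoch}: train={train_loss:.4f}  val={val_loss:.4f}")
```

If both losses plateau at roughly the same value, the model is underfitting; if the validation loss stalls or rises while the training loss keeps falling, it is overfitting.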