Submitted by redditnit21 t3_y5qn9h in deeplearning
danielgafni t1_it97yro wrote
Reply to comment by redditnit21 in Testing Accuracy higher than Training Accuracy by redditnit21
Don’t remove it, that’s just how dropout works. Dropout is only active during training, so the training loss (and accuracy) is computed with part of the network randomly switched off, while at evaluation time the full network is used. There is nothing wrong with having a higher training loss than test loss if you are using dropout.
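A minimal sketch of why this happens, assuming PyTorch (the layer sizes, dropout rate, and variable names are illustrative, not from the original post): the same input goes through the dropout layer differently in train mode versus eval mode, so metrics logged during training are computed with part of the network disabled.

```python
# Sketch: dropout is active in train mode and disabled in eval mode (PyTorch).
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(
    nn.Linear(10, 10),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes ~half the activations, but only in train mode
    nn.Linear(10, 2),
)

x = torch.randn(1, 10)

model.train()            # dropout active: this is what training loss/accuracy "sees"
out_train = model(x)

model.eval()             # dropout disabled: the full network runs at test time
with torch.no_grad():
    out_eval = model(x)

print(out_train)
print(out_eval)          # typically differs from out_train because dropout was off
```

Because the evaluation pass uses the full network, test accuracy can legitimately come out higher than the training accuracy logged during the dropout-perturbed forward passes.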