[D] Does it make sense to use dropout and layer normalization in the same model?
Submitted by Beneficial_Law_5613 (t3_zzqzoy) on December 31, 2022 at 10:19 AM in MachineLearning · 4 comments · 2 points
hannahmontana1814 (t1_j2dwds2) wrote on December 31, 2022 at 3:17 PM: Yes, it makes sense to use dropout and layer normalization in the same model, but only if you want your model to overfit and perform worse than it could. (−5 points)
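For context on the question itself: combining dropout and layer normalization is common in practice, most visibly in Transformer blocks, which apply dropout to a sublayer's output and then LayerNorm after the residual connection. A minimal PyTorch sketch of that pattern (the module name `FeedForwardBlock` and the hyperparameter values here are illustrative, not from the thread):

```python
import torch
import torch.nn as nn

class FeedForwardBlock(nn.Module):
    """Transformer-style sublayer: dropout on the sublayer output,
    then a residual add followed by LayerNorm (post-norm ordering)."""

    def __init__(self, d_model=64, d_ff=256, p_drop=0.1):
        super().__init__()
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.ReLU(),
            nn.Linear(d_ff, d_model),
        )
        self.dropout = nn.Dropout(p_drop)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):
        # Dropout is applied before the residual add; LayerNorm after.
        return self.norm(x + self.dropout(self.ff(x)))

block = FeedForwardBlock()
out = block(torch.randn(8, 10, 64))  # (batch, seq_len, d_model)
print(out.shape)
```

Note that LayerNorm normalizes per-example over the feature dimension rather than over the batch, so it does not suffer the train/test variance mismatch that makes dropout awkward to combine with *batch* normalization; that distinction may be what the sarcastic reply above is conflating.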