xylont OP t1_j0aalgu wrote
Reply to comment by SnooDogs3089 in [D] What would happen if you normalize each sample on its on before sending it to the neural net? by xylont
Why not?
SnooDogs3089 t1_j0bn5w3 wrote
Because the NN will "undo" it, or do it itself if needed, assuming a reasonably large network. Batch norm makes a lot of sense, but inside the layers. I don't know how you're planning to use your NN, but per-sample normalization means the deployed model needs an extra preprocessing step that in most cases isn't necessary; it just costs resources and adds a source of errors for end users. You also have to be very careful that future inputs get normalized exactly the same way. To summarize: at best it's only slightly useful, and in the worst case it's very dangerous.
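To make the trade-off concrete, here is a minimal sketch of the per-sample normalization being discussed, assuming simple 2-D float inputs (rows = samples); `normalize_per_sample` is a hypothetical helper, not anything from the thread. Whatever transform you apply at training time must be shipped and re-applied identically at inference time, which is exactly the extra deployment step the comment warns about.

```python
import numpy as np

def normalize_per_sample(x):
    """Scale each sample (row) to zero mean and unit variance.

    Hypothetical helper for illustration; the epsilon guards
    against division by zero for constant rows.
    """
    mean = x.mean(axis=1, keepdims=True)
    std = x.std(axis=1, keepdims=True)
    return (x - mean) / (std + 1e-8)

# Two samples on very different scales:
batch = np.array([[1.0, 2.0, 3.0],
                  [10.0, 20.0, 30.0]])
normed = normalize_per_sample(batch)
# After normalization, every row has mean ~0 and std ~1,
# so the scale difference between samples is erased --
# and the deployed model must apply the same step to every input.
```

By contrast, a `BatchNorm` layer inside the network learns its statistics during training and carries them inside the saved model, so no separate preprocessing code has to be maintained at deployment.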