SnooDogs3089 t1_j08qrh1 wrote
Batch norm is the way. Don't touch anything before feeding it to the network.
xylont OP t1_j0aalgu wrote
Why not?
SnooDogs3089 t1_j0bn5w3 wrote
Because the NN will either undo the normalization or, if it's actually needed, learn it on its own, assuming the network is reasonably large. Batch norm makes a lot of sense, but inside the layers. I don't know how you're planning to use your NN, but normalizing the inputs means the deployed model needs an additional preprocessing step that in most cases isn't necessary; it just consumes resources and adds a possible source of errors for end users. You also have to be very careful that future data is normalized with exactly the same statistics as the training data. To sum up: at best it's only slightly useful, and in the worst case it's dangerous.
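To illustrate the point, here's a minimal sketch (in PyTorch, with illustrative layer sizes that aren't from the thread) of what "batch norm inside the layers" looks like: the normalization lives in the model itself, so raw, unscaled features can be fed straight in, and the deployed model carries its own statistics.

```python
import torch
import torch.nn as nn

# Hypothetical example: 10 raw input features, one regression output.
# BatchNorm1d layers standardize activations batch by batch, so no
# separate input-preprocessing step is required.
model = nn.Sequential(
    nn.Linear(10, 64),   # raw features go straight in
    nn.BatchNorm1d(64),  # normalizes this layer's activations
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.BatchNorm1d(64),
    nn.ReLU(),
    nn.Linear(64, 1),
)

# Deliberately unscaled inputs; the network handles them as-is.
x = torch.randn(32, 10) * 100
model.train()
y = model(x)

# At deployment, model.eval() switches batch norm to its stored running
# statistics, so end users never touch a normalization step.
model.eval()
y_infer = model(x)
```

This is the practical upside: the training-time statistics travel with the model, instead of living in a separate preprocessing script that can drift out of sync.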