SnooDogs3089
SnooDogs3089 t1_j0g3jim wrote
As always, it depends. They are all tools based on some statistical theory that, on a spectrum from "perfect case scenario" to "absolutely not feasible", give answers to questions. The choice of one tool over another is in the hands of the expert, given a great deal of consideration and educated guesses about the outcome of the project. Tools are tools. A craftsman knows when a hammer is appropriate and when a precision drill is appropriate. Same for us.
SnooDogs3089 t1_j0bn5w3 wrote
Reply to comment by xylont in [D] What would happen if you normalize each sample on its on before sending it to the neural net? by xylont
Because the NN will "undo" it, or do it itself if needed, assuming a reasonably big network. Batch norm makes a lot of sense, but inside the layers. I don't know how you are planning to use your NN, but normalizing means the deployed model needs an additional preprocessing step that in most cases isn't necessary: it only costs resources and adds a possible source of errors for end users. You also have to be very careful about how future data gets normalized. In short, at best it's only slightly useful, and in the worst case it's very dangerous.
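A minimal sketch of what this looks like in practice, assuming a PyTorch-style setup (the framework and the layer sizes here are illustrative, not something stated in the thread): normalization lives inside the model, so end users never need a separate preprocessing pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical toy model: batch norm sits inside the layers, so raw
# (unnormalized) features can be fed in directly at deployment time.
model = nn.Sequential(
    nn.Linear(16, 32),    # 16 raw input features, no external scaling
    nn.BatchNorm1d(32),   # normalization happens inside the network
    nn.ReLU(),
    nn.Linear(32, 1),
)

x = torch.randn(8, 16)    # a batch of 8 unnormalized samples
out = model(x)            # no per-sample preprocessing step required
print(out.shape)          # torch.Size([8, 1])
```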
SnooDogs3089 t1_j08qrh1 wrote
Reply to [D] What would happen if you normalize each sample on its on before sending it to the neural net? by xylont
Batch norm is the way. Do not touch anything before feeding it in.
SnooDogs3089 t1_j9324td wrote
Reply to [D] Please stop by [deleted]
"No one with a working brain will design an AI that is self aware" if you can name one person living in this world capable of designing the Saint Graal of AI research please let me know. Anyway I agree...if this is the level of DS around the world my job is safe for the next 20 years