Submitted by Constant-Cranberry29 t3_ywu5zb in deeplearning
sqweeeeeeeeeeeeeeeps t1_iwnu8yt wrote
You are misinterpreting what “normalizing” means. It standardizes your data to zero mean and unit variance, so you end up with positive and negative values centered around 0. This works well for most deep learning models. Scaling to the interval [0, 1] is not as good, because certain features negatively impact certain results and you want the model to be able to learn negative weights for them.
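A minimal sketch of the distinction described above, using a hypothetical toy feature column: standardization centers data at 0 with unit variance, while min-max scaling squeezes it into [0, 1].

```python
import numpy as np

# Hypothetical toy feature column with an arbitrary scale.
x = np.array([10.0, 12.0, 8.0, 15.0, 5.0])

# Standardization (z-score): subtract the mean, divide by the standard
# deviation. Result is centered at 0 with unit variance, so values can
# be positive or negative, as described above.
x_std = (x - x.mean()) / x.std()

# Min-max scaling maps values into [0, 1] instead; both are common,
# but they produce differently distributed inputs.
x_minmax = (x - x.min()) / (x.max() - x.min())

print(round(x_std.mean(), 6), round(x_std.std(), 6))  # ~0.0 and 1.0
print(x_minmax.min(), x_minmax.max())                 # 0.0 and 1.0
```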
Constant-Cranberry29 OP t1_iwnx4ze wrote
So what should I do to solve this problem?
sqweeeeeeeeeeeeeeeps t1_iwnx6pv wrote
What’s your problem? Normalized data is good.
Constant-Cranberry29 OP t1_iwnxdah wrote
I want to reduce the prediction shift when I don't use abs().
sqweeeeeeeeeeeeeeeps t1_iwnxlbc wrote
? Not sure. Train it longer, lower the learning rate; are you using teacher forcing? I'm not very familiar with best LSTM practices.
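The "lower learning rate" suggestion can be illustrated with a minimal sketch (not the OP's model): plain gradient descent on the toy function f(w) = w², where too large a step size makes the iterate oscillate and diverge, while a smaller one converges.

```python
def train(lr, steps=50, w=5.0):
    """Gradient descent on f(w) = w**2, starting from w = 5."""
    for _ in range(steps):
        grad = 2 * w      # derivative of w**2
        w -= lr * grad
    return w

# Too large a learning rate: each step multiplies w by (1 - 2*lr) = -1.2,
# so the magnitude grows and training diverges.
print(train(lr=1.1))

# Smaller learning rate: each step multiplies w by 0.8, so w shrinks
# toward the minimum at 0.
print(train(lr=0.1))
```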