
sqweeeeeeeeeeeeeeeps t1_iwnu8yt wrote

You are misinterpreting what “normalizing” means here. Standardizing rescales your data to zero mean and unit variance, so you get positive and negative values centered around 0. That is what most deep learning models work best with. Squashing everything into [0, 1] is not ideal, because features that negatively impact the result should be able to take values on the negative side of zero.
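For what it's worth, the difference between the two scalings is easy to see in a small NumPy sketch (the feature values here are made up for illustration):

```python
import numpy as np

# Hypothetical feature column
x = np.array([10.0, 12.0, 8.0, 15.0, 11.0])

# Standardization (z-score): zero mean, unit variance,
# values land on both sides of 0
z = (x - x.mean()) / x.std()

# Min-max scaling: squashes everything into [0, 1],
# so no value is ever negative
m = (x - x.min()) / (x.max() - x.min())

print(z)  # mixed signs, mean ~0
print(m)  # all values in [0, 1]
```

Both are often loosely called "normalization", which is where the confusion usually starts.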


Constant-Cranberry29 OP t1_iwnx4ze wrote

So what should I do to solve this problem?


sqweeeeeeeeeeeeeeeps t1_iwnx6pv wrote

What’s your problem? Normalized data is good.


Constant-Cranberry29 OP t1_iwnxdah wrote

I want to reduce the prediction shift when I don't use abs().


sqweeeeeeeeeeeeeeeps t1_iwnxlbc wrote

? Not sure. Train it longer, lower the learning rate, and are you using teacher forcing? I'm not very familiar with best LSTM practices.
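To illustrate the "lower the learning rate" suggestion: a toy gradient-descent sketch on a made-up quadratic loss (the function and step sizes are hypothetical, not from the thread), showing that a smaller step converges smoothly while a large one oscillates around the minimum:

```python
# Minimize f(w) = (w - 3)^2 with plain gradient descent.
def descend(lr, steps=200, w=0.0):
    for _ in range(steps):
        grad = 2 * (w - 3)  # df/dw
        w -= lr * grad
    return w

w_big = descend(lr=0.9)    # large step: overshoots and oscillates each update
w_small = descend(lr=0.05) # small step: smooth, steady approach to w = 3
```

The same trade-off applies when dropping an optimizer's learning rate for an LSTM: training gets more stable, but usually needs more epochs.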
