
vjb_reddit_scrap t1_ivymo0p wrote

IIRC Hinton et al. had a paper about initializing RNNs with the identity matrix, and it solved many of the problems that LSTM solves.
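
The idea can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's exact recipe: the function name, the small-weight scale, and the ReLU assumption are all my own choices here.

```python
import numpy as np

def init_irnn(hidden_size, input_size, seed=None):
    """Sketch of identity initialization for a plain RNN:
    recurrent weights start as the identity, biases at zero,
    input weights small random (scale is an assumed hyperparameter)."""
    rng = np.random.default_rng(seed)
    W_hh = np.eye(hidden_size)                                 # identity recurrent matrix
    W_xh = rng.normal(0.0, 0.001, (hidden_size, input_size))   # small random input weights
    b_h = np.zeros(hidden_size)
    return W_hh, W_xh, b_h

# With ReLU units, h_t = relu(W_hh @ h_prev + W_xh @ x_t + b_h) initially
# just copies the hidden state forward, so early gradients neither
# explode nor vanish through the recurrence.
```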


DrXaos t1_iw04agd wrote

That’s a different scenario, and one that’s clearly justified on dynamical grounds.

Any recurrent neural network is a nonlinear dynamical system. Learning happens best on the boundary between dissipation (vanishing gradients) and chaos (exploding gradients).

The additive incorporation of new information in LSTM/GRU greatly ameliorates the usual problem of RNNs with random transition matrices, where perturbations evolve multiplicatively. Initializing an RNN to a zero Lyapunov exponent via the identity matrix helps in the same way.
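
The multiplicative-evolution point can be made concrete with a small NumPy experiment (an illustrative sketch; the matrix scale 1.5/sqrt(n) and step count are assumptions chosen to put the random matrix in the chaotic regime). A perturbation pushed through the linearized recurrence grows or shrinks geometrically under a random transition matrix, but keeps its norm under the identity, whose Lyapunov exponent is zero.

```python
import numpy as np

def perturbation_norms(W, steps=50):
    """Track the norm of a small perturbation delta as it evolves under
    the linearized recurrence delta <- W @ delta."""
    n = W.shape[0]
    delta = np.ones(n) / np.sqrt(n)   # unit-norm initial perturbation
    norms = []
    for _ in range(steps):
        delta = W @ delta
        norms.append(np.linalg.norm(delta))
    return norms

n = 64
rng = np.random.default_rng(0)
W_random = rng.normal(0.0, 1.5 / np.sqrt(n), (n, n))  # spectral radius ~1.5: chaotic side
W_identity = np.eye(n)                                 # zero Lyapunov exponent

# Under W_random the perturbation norm changes geometrically;
# under W_identity it stays constant at every step.
```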
