Ford_O t1_iwtxi5i wrote
Reply to [R] RWKV-4 7B release: an attention-free RNN language model matching GPT-J performance (14B training in progress) by bo_peng
How much smaller are the embeddings?
Ford_O t1_iwtx6nb wrote
Reply to comment by bo_peng in [R] RWKV-4 7B release: an attention-free RNN language model matching GPT-J performance (14B training in progress) by bo_peng
Could you also measure the performance on CPU?
Ford_O t1_iwtrw98 wrote
Reply to [R] RWKV-4 7B release: an attention-free RNN language model matching GPT-J performance (14B training in progress) by bo_peng
How much faster is the RNN at inference than GPT-J?
Ford_O t1_iz2eau3 wrote
Reply to [R] The Forward-Forward Algorithm: Some Preliminary Investigations [Geoffrey Hinton] by shitboots
So that's why I keep getting nightmares.
Jokes aside, this sounds quite plausible. However, I am unsure whether this can ever be more efficient than backprop. Still, it could have a huge impact on neuroscience if it turns out that this is what happens during sleep.
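For context, the core idea in the Forward-Forward paper is that each layer is trained with a purely local objective: push a "goodness" score (sum of squared activations) above a threshold for positive data and below it for negative data, with no gradient chain between layers. A minimal single-layer sketch of that idea (all data, sizes, and the threshold here are made-up toy values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # Hinton's proposed "goodness": sum of squared activations per sample
    return (h ** 2).sum(axis=1)

# toy data: positive samples clustered near +1, negatives near -1 (hypothetical)
pos = rng.normal(1.0, 0.1, size=(64, 4))
neg = rng.normal(-1.0, 0.1, size=(64, 4))

W = rng.normal(0.0, 0.1, size=(4, 8))
theta = 2.0   # goodness threshold (arbitrary toy choice)
lr = 0.05

for _ in range(200):
    for x, sign in ((pos, +1.0), (neg, -1.0)):
        h = np.maximum(x @ W, 0.0)        # ReLU layer
        g = goodness(h)
        # logistic loss: drive goodness above theta for positives,
        # below theta for negatives -- gradient is local to this layer
        p = 1.0 / (1.0 + np.exp(-sign * (g - theta)))
        dg = -sign * (1.0 - p)            # dLoss/dg
        dh = 2.0 * h * dg[:, None]        # dg/dh = 2h (zero where ReLU is off)
        W -= lr * x.T @ dh / len(x)

print(goodness(np.maximum(pos @ W, 0.0)).mean(),
      goodness(np.maximum(neg @ W, 0.0)).mean())
```

The point of the comparison with backprop is visible here: each layer's update needs only its own input and output, so there is no backward pass to synchronize across layers, but every layer also pays for its own forward evaluations on positive and negative data.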