inquisitor49 t1_j4tgazw wrote on January 18, 2023 at 3:16 AM
Reply to [D] Simple Questions Thread by AutoModerator
In transformers, a positional embedding is added to each word embedding. Why doesn't this corrupt the word embedding, e.g., by turning it into the embedding of a different word?
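The setup the question describes can be sketched as follows (a minimal illustration; the variable names, shapes, and the choice of sinusoidal encodings are my own, not from the thread):

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """Sinusoidal positional encodings as in "Attention Is All You Need":
    even dimensions use sin, odd dimensions use cos, with geometrically
    spaced frequencies."""
    pos = np.arange(seq_len)[:, None]   # (seq_len, 1)
    i = np.arange(d_model)[None, :]     # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

seq_len, d_model = 4, 8
word_emb = np.random.randn(seq_len, d_model)  # hypothetical token embeddings
pos_emb = sinusoidal_positions(seq_len, d_model)

# This element-wise sum is the addition the question asks about:
x = word_emb + pos_emb  # still shape (seq_len, d_model)
```

Note that each positional component is bounded in [-1, 1], while the word embeddings are learned jointly with this addition already in place, so the model can arrange the embedding space such that the shift rarely collides with another word's vector.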