Transformer Positional Encoding
15th January 2026
Transformers lack the recurrence of RNNs and the convolutions of CNNs, both of which implicitly capture the order of a sequence. To compensate, a positional encoding is added to the input embeddings so the model retains information about token order.
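
One widely used choice is the fixed sinusoidal encoding from the original Transformer paper (Vaswani et al., 2017), where each position is mapped to a vector of sines and cosines at geometrically spaced frequencies. Below is a minimal NumPy sketch of that scheme; the function name is my own, and it assumes an even d_model:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Build the (seq_len, d_model) sinusoidal positional encoding matrix.

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]        # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]       # shape (1, d_model / 2)
    angle_rates = 1.0 / np.power(10000, dims / d_model)  # one rate per sin/cos pair
    angles = positions * angle_rates                     # shape (seq_len, d_model / 2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions: cosine
    return pe

# The encoding is simply added to the token embeddings before the first layer:
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```

Because each frequency pair behaves like a rotation, the encoding of position pos + k can be expressed as a fixed linear function of the encoding of pos, which is what lets the model attend to relative offsets.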