Rohan Singh's Weblog

Writing about software, data science, and things I learn along the way.

Transformer Positional Encoding

machine-learning-stats


Transformers have neither the recurrence of RNNs nor the convolutions of CNNs, so nothing in the architecture itself captures the order of a sequence. To solve this, positional encodings are added to the input embeddings to preserve ordering information.
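A minimal NumPy sketch of the sinusoidal positional encoding from the original Transformer paper ("Attention Is All You Need"): each position gets a vector of sines and cosines at geometrically spaced frequencies, so nearby positions get similar vectors and relative offsets correspond to fixed linear transformations.

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]       # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]      # even dims: (1, d_model/2)
    angle_rates = 1.0 / np.power(10000.0, dims / d_model)
    angles = positions * angle_rates                    # (seq_len, d_model/2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even indices get sine
    pe[:, 1::2] = np.cos(angles)  # odd indices get cosine
    return pe

# The encoding is simply added to the token embeddings:
pe = positional_encoding(seq_len=50, d_model=16)
```

At position 0 every sine term is 0 and every cosine term is 1, which is a quick sanity check that the interleaving is correct.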

← Back to TIL