jploski/RotaryEmbedding


Side-by-side comparison of the RoPE and xPos positional embedding algorithms used in LLMs.

This repository contains a self-contained Python implementation of each positional embedding. xPos was invented as an enhancement to RoPE, addressing RoPE's poor extrapolation to longer contexts and the undesirable cyclic oscillation of its attention scores with increasing token distance.
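As a rough illustration of the underlying idea (a minimal NumPy sketch, not the code in this repository), RoPE rotates each pair of feature dimensions by a position-dependent angle; its key property is that the attention score between a rotated query and key depends only on their relative distance:

```python
import numpy as np

def rope_rotate(x, pos, base=10000):
    """Apply RoPE to a single vector (dim must be even).

    Each pair (x[2i], x[2i+1]) is rotated by the angle pos * theta_i,
    where theta_i is a per-pair frequency decreasing with i.
    """
    dim = x.shape[0]
    theta = base ** (-np.arange(0, dim, 2) / dim)  # per-pair frequencies
    angles = pos * theta
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin
    out[1::2] = x1 * sin + x2 * cos
    return out

# The score q(m) . k(n) depends only on the offset m - n:
rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
s1 = rope_rotate(q, 5) @ rope_rotate(k, 3)    # offset 2
s2 = rope_rotate(q, 12) @ rope_rotate(k, 10)  # offset 2
assert abs(s1 - s2) < 1e-9
```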

The (currently most common?) implementation of RoPE (neox.py) is borrowed from Falcon LLM (which in turn borrowed it from GPT-NeoX). The implementation of xPos (xpos.py) was derived by changing rope.py based on the code included in syncdoth/RetNet (which in turn borrowed it from torchscale). Finally, the original implementation of RoPE (rope.py) was obtained by removing the scaling factor from the xPos implementation.
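The scaling factor mentioned above can be sketched as follows (a hedged illustration assuming the torchscale defaults, scale_base=512 and a per-pair scale of (i + 0.4·d)/(1.4·d); not necessarily identical to xpos.py here). xPos multiplies the rotated query by ζ^(pos/scale_base) and the rotated key by ζ^(−pos/scale_base), so the factors cancel at zero token distance and exponentially damp the score as distance grows:

```python
import numpy as np

def xpos_rotate(x, pos, direction, base=10000, scale_base=512):
    """RoPE rotation plus the xPos magnitude scaling.

    direction: +1 for queries, -1 for keys, so the per-pair scales
    cancel exactly when query and key sit at the same position.
    (scale_base=512 and the 0.4/1.4 constants follow torchscale defaults.)
    """
    dim = x.shape[0]
    theta = base ** (-np.arange(0, dim, 2) / dim)
    # Per-pair scale in (0, 1); raised to a power proportional to position.
    zeta = (np.arange(0, dim, 2) + 0.4 * dim) / (1.4 * dim)
    scale = zeta ** (direction * pos / scale_base)
    angles = pos * theta
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = (x1 * cos - x2 * sin) * scale
    out[1::2] = (x1 * sin + x2 * cos) * scale
    return out

# At zero distance the scales cancel and the score equals the plain dot product:
rng = np.random.default_rng(1)
q, k = rng.standard_normal(8), rng.standard_normal(8)
s = xpos_rotate(q, 7, +1) @ xpos_rotate(k, 7, -1)
assert abs(s - q @ k) < 1e-9
```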

The original papers:

- RoPE: "RoFormer: Enhanced Transformer with Rotary Position Embedding" (arXiv:2104.09864)
- xPos: "A Length-Extrapolatable Transformer" (arXiv:2212.10554)
