Releases: lucidrains/sinkhorn-transformer

0.11.4 (10 Aug 02:49)
fix Autopadder when no mask is given
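
For context, a minimal sketch of the call path this fix covers, assuming the SinkhornTransformerLM and Autopadder names from the project README; the hyperparameters are illustrative:

    import torch
    from sinkhorn_transformer import SinkhornTransformerLM, Autopadder

    # illustrative hyperparameters; see the README for the full argument list
    model = SinkhornTransformerLM(
        num_tokens = 20000,
        dim = 512,
        depth = 4,
        heads = 8,
        max_seq_len = 2048,
        bucket_size = 128,
        causal = True
    )

    # Autopadder pads inputs up to a multiple of the bucket size
    model = Autopadder(model)

    x = torch.randint(0, 20000, (1, 1117))  # odd sequence length
    out = model(x)                           # called without a mask, the path this release fixes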

0.11.3 (18 Mar 03:39, commit 76de0f4)
add local-attention as a dependency
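
The local attention module now comes from the standalone local-attention package (lucidrains/local-attention) rather than code bundled in this repository. A rough, hedged usage sketch; the arguments and tensor layout are illustrative, so check that package's README for the exact current signature:

    import torch
    from local_attention import LocalAttention  # pip install local-attention

    # illustrative parameters; the real signature may differ across versions
    attn = LocalAttention(window_size = 128, causal = True, dropout = 0.1)

    q = torch.randn(2, 8, 1024, 64)  # (batch, heads, seq, dim_head), assumed layout
    k = torch.randn(2, 8, 1024, 64)
    v = torch.randn(2, 8, 1024, 64)

    out = attn(q, k, v)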

0.11.2 (02 Jan 01:30)
import gcd from the math module, for Python 3.9 compatibility
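
Background: fractions.gcd was removed in Python 3.9, while math.gcd has existed since Python 3.5, making it the drop-in replacement:

    # from fractions import gcd   # removed in Python 3.9, raises ImportError
    from math import gcd          # available since Python 3.5

    # e.g. reconciling a sequence length with a bucket size
    assert gcd(2048, 128) == 128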

0.11.1 (04 Nov 20:15)
fix a bug in the post-attention layer norms when using embedding factorization

0.11.0 (29 Oct 19:57)
normalize after attention layers
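
As a generic PyTorch sketch, the textbook post-norm pattern applies LayerNorm after the attention sub-layer and its residual connection; the module name here is illustrative and the exact placement in this library may differ:

    import torch.nn as nn

    class PostNorm(nn.Module):
        # wraps a sub-layer (e.g. attention) and normalizes after the residual add
        def __init__(self, dim, fn):
            super().__init__()
            self.fn = fn
            self.norm = nn.LayerNorm(dim)

        def forward(self, x, **kwargs):
            return self.norm(x + self.fn(x, **kwargs))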

0.10.3 (17 Sep 02:09)
Update setup.py

0.10.2 (17 Sep 02:08)
add embedding dropout
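
Embedding dropout typically means dropout applied directly to the token embedding output before the transformer layers; a minimal sketch, with the argument name emb_dropout chosen for illustration:

    import torch.nn as nn

    class TokenEmbedding(nn.Module):
        # dropout right after the embedding lookup, before any attention layers
        def __init__(self, num_tokens, dim, emb_dropout = 0.1):
            super().__init__()
            self.emb = nn.Embedding(num_tokens, dim)
            self.dropout = nn.Dropout(emb_dropout)

        def forward(self, x):
            return self.dropout(self.emb(x))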

0.10.1 (05 Aug 19:56)
bump for release

0.10.0 (14 Jul 01:57)
bump package

0.9.1 (05 Jul 20:17)
move local attention to a separate repository