Actions: fla-org/flash-linear-attention
51 workflow runs
- Run #39: [Mamba2] Fix slow path (Pull request #84, synchronize, by vasqu)
- Run #38: [Mamba2] Fix slow path (Pull request #84, opened, by vasqu)
- Run #30: max_seqlen when max_position_embeddings is None (Pull request #59, opened, by zhixuan-lin)
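The run above concerns falling back to a sequence-length bound when `max_position_embeddings` is unset. A minimal sketch of that fallback pattern, assuming a config-style lookup (the helper name and parameters are hypothetical, not taken from the fla codebase):

```python
def resolve_max_len(max_position_embeddings, max_seqlen):
    """Hypothetical helper: prefer the configured max_position_embeddings,
    falling back to max_seqlen when the config leaves it as None."""
    if max_position_embeddings is not None:
        return max_position_embeddings
    return max_seqlen

print(resolve_max_len(None, 2048))   # -> 2048 (falls back to max_seqlen)
print(resolve_max_len(4096, 2048))   # -> 4096 (config value wins)
```

The explicit `is not None` check matters: a falsy-but-set value such as `0` should still win over the fallback.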
- Run #29: [Mamba2] Post Merge Fixes - norm_before_gate and generation with inputs_embeds (Pull request #57, opened, by vasqu)
- Run #28: __init__.py in fla/ops/common for automatic package discovery (Pull request #56, reopened, by yzhangcs)
- Run #27: __init__.py in fla/ops/common for automatic package discovery (Pull request #56, opened, by zhixuan-lin)
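The two runs for pull request #56 add an `__init__.py` under `fla/ops/common` so that directory is picked up by automatic package discovery. A small sketch of why the file matters, assuming setuptools-style discovery (the directory layout below is a stand-in built in a temp dir, not the real repo):

```python
import os
import tempfile
from setuptools import find_packages

with tempfile.TemporaryDirectory() as root:
    # Mirror the fla/ops/common layout inside a scratch directory.
    os.makedirs(os.path.join(root, "fla", "ops", "common"))
    for pkg in ("fla", os.path.join("fla", "ops")):
        open(os.path.join(root, pkg, "__init__.py"), "w").close()

    # find_packages() only treats directories containing __init__.py
    # as packages, so fla.ops.common is absent here.
    print(find_packages(where=root))

    # Adding the marker file makes the subpackage discoverable.
    open(os.path.join(root, "fla", "ops", "common", "__init__.py"), "w").close()
    print(find_packages(where=root))
```

Namespace-package discovery (`find_namespace_packages`) would pick the directory up without the marker file, but classic `find_packages` silently skips it, which is the failure mode the PR title points at.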