If you find my projects useful, please consider becoming a sponsor. Everything here is built in my free time and released under permissive licenses (e.g. MIT). Your contribution helps fund open-source AI.
- Huntsville, AL
- LinkedIn: in/frank-odom
- Blog: https://fkodom.substack.com
Pinned
- fft-conv-pytorch: Implementation of 1D, 2D, and 3D FFT convolutions in PyTorch. Much faster than direct convolutions for large kernel sizes.
- yet-another-retnet: A simple but robust PyTorch implementation of RetNet from "Retentive Network: A Successor to Transformer for Large Language Models" (https://arxiv.org/pdf/2307.08621.pdf)
- grouped-query-attention-pytorch: (Unofficial) PyTorch implementation of grouped-query attention (GQA) from "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints" (https://arxiv.org/pdf/2305.13245.pdf)
- transformer-from-scratch: Code implementation from my blog post: https://fkodom.substack.com/p/transformers-from-scratch-in-pytorch
- clip-text-decoder: Generate text captions for images from their embeddings.
- soft-mixture-of-experts: PyTorch implementation of Soft MoE by Google Brain, from "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
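The speed claim behind fft-conv-pytorch comes from the convolution theorem: convolution in the spatial domain equals pointwise multiplication in the frequency domain, turning an O(n·k) operation into O(n log n). Below is a minimal standalone NumPy sketch of that idea for illustration only; it is not the repo's API, which provides PyTorch modules for 1D, 2D, and 3D inputs.

```python
import numpy as np

def fft_conv1d(signal: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """'Full' 1D convolution computed via the FFT (convolution theorem)."""
    n = len(signal) + len(kernel) - 1  # output length of a full convolution
    # Zero-pad both inputs to the output length (the `n` argument to rfft),
    # multiply their spectra, and transform back. rfft/irfft are used
    # because the inputs are real-valued.
    return np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)

rng = np.random.default_rng(0)
signal = rng.standard_normal(1024)
kernel = rng.standard_normal(257)

direct = np.convolve(signal, kernel)   # O(n * k) direct convolution
via_fft = fft_conv1d(signal, kernel)   # O(n log n) via the FFT

# Same result up to floating-point error; the FFT route wins for large kernels.
assert np.allclose(direct, via_fft)
```

For small kernels the direct method is still faster in practice (the FFT has constant overhead), which is why the repo advertises the speedup specifically for large kernel sizes.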