Implementation of the paper [Using Fast Weights to Attend to the Recent Past](https://arxiv.org/abs/1610.06258)
Official repository for the paper "A Modern Self-Referential Weight Matrix That Learns to Modify Itself" (ICML 2022 & NeurIPS 2021 Deep RL Workshop) and "Accelerating Neural Self-Improvement via Bootstrapping" (ICLR 2023 Workshop)
Official repository for the paper "Going Beyond Linear Transformers with Recurrent Fast Weight Programmers" (NeurIPS 2021)
Official repository for the paper "Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules" (NeurIPS 2022)
PyTorch Language Modeling Toolkit for Fast Weight Programmers
Official repository for the paper "Images as Weight Matrices: Sequential Image Generation Through Synaptic Learning Rules" (ICLR 2023)
Official repository for the paper "Automating Continual Learning"
PyTorch implementation of the paper [Using Fast Weights to Attend to the Recent Past](https://arxiv.org/abs/1610.06258)
Official repository for the paper "Practical Computational Power of Linear Transformers and Their Recurrent and Self-Referential Extensions" (EMNLP 2023)
PyTorch implementation of DCT fast weight RNNs
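The repositories above implement variants of the fast-weight mechanism from Ba et al. (2016), in which a fast weight matrix decays over time while accumulating a Hebbian outer product of the hidden state. A minimal NumPy sketch of that update rule, assuming the standard decay/Hebbian form from the paper (the function name, toy dimensions, and hyperparameter values are illustrative, not taken from any of the repositories):

```python
import numpy as np

def fast_weight_step(A, h, lam=0.95, eta=0.5):
    """One fast-weight update: A(t) = lam * A(t-1) + eta * h h^T.

    lam is the decay rate, eta the fast learning rate (names follow
    the paper); h is the current hidden-state vector.
    """
    return lam * A + eta * np.outer(h, h)

d = 4                       # illustrative hidden size
A = np.zeros((d, d))        # fast weights start at zero
rng = np.random.default_rng(0)
for _ in range(10):
    h = rng.standard_normal(d)
    A = fast_weight_step(A, h)

# Applying A to the latest h acts like attention over the recent past:
# A @ h is a sum of earlier hidden states, each scaled by its recency
# (powers of lam) and its similarity to h (the inner product).
attended = A @ h
```

Because the update is a sum of outer products, `A @ h` expands to `sum_s lam**(t-s) * eta * (h_s @ h) * h_s`, which is the attention-to-the-recent-past interpretation the paper's title refers to.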