Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21 (a minimal sketch of the core attention mechanism is given after this list).
Papers about graph transformers.
Recipe for a General, Powerful, Scalable Graph Transformer
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
The official implementation of the ICLR 2023 spotlight paper "DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion"
The official implementation of the NeurIPS 2022 spotlight paper "NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification"
Official PyTorch code for the Structure-Aware Transformer.
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
[AAAI2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
Deep learning toolkit for Drug Design with Pareto-based Multi-Objective optimization in Polypharmacology
Code for the AAAI 2020 paper "Graph Transformer for Graph-to-Sequence Learning"
Long Range Graph Benchmark, NeurIPS 2022 Track on D&B
Official Code Repository for the paper "Accurate Learning of Graph Representations with Graph Multiset Pooling" (ICLR 2021)
SignNet and BasisNet
Code for our paper "Attending to Graph Transformers"
A comprehensive resource hub compiling all graph papers accepted at the International Conference on Learning Representations (ICLR) in 2024.
[ICDE'2023] When Spatio-Temporal Meet Wavelets: Disentangled Traffic Forecasting via Efficient Spectral Graph Attention Networks
Repository for CARTE: Context-Aware Representation of Table Entries
[SIGIR'2023] "GFormer: Graph Transformer for Recommendation"
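Most of the repositories above share the same core idea: self-attention that is restricted to (or biased by) the graph structure, with node features augmented by structural or positional encodings such as Laplacian eigenvectors. The snippet below is a minimal PyTorch sketch of that idea, not code from any listed repository; the module name GraphSelfAttention, the dense adj argument, and the toy graph are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the official DLG-AAAI'21 code):
# multi-head self-attention masked by the adjacency matrix, plus a simple
# Laplacian positional encoding for the node features.
import math
import torch
import torch.nn as nn


class GraphSelfAttention(nn.Module):
    """Multi-head attention where node i only attends to nodes j with adj[i, j] = 1."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (num_nodes, dim) node features (e.g. features + positional encoding)
        # adj: (num_nodes, num_nodes) binary adjacency matrix with self-loops
        n, dim = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape to (heads, nodes, head_dim)
        q = q.view(n, self.num_heads, self.head_dim).transpose(0, 1)
        k = k.view(n, self.num_heads, self.head_dim).transpose(0, 1)
        v = v.view(n, self.num_heads, self.head_dim).transpose(0, 1)

        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)  # (heads, n, n)
        # mask out non-neighbours so attention respects the graph structure
        scores = scores.masked_fill(adj.bool().logical_not(), float("-inf"))
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(0, 1).reshape(n, dim)
        return self.out(out)


def laplacian_positional_encoding(adj: torch.Tensor, k: int) -> torch.Tensor:
    """k smallest non-trivial eigenvectors of the graph Laplacian
    (eigenvector sign ambiguity left unhandled; see the SignNet and BasisNet entry above)."""
    deg = torch.diag(adj.sum(dim=-1))
    lap = deg - adj
    eigvals, eigvecs = torch.linalg.eigh(lap)  # eigenvalues in ascending order
    return eigvecs[:, 1 : k + 1]               # drop the constant eigenvector


if __name__ == "__main__":
    # toy 4-node cycle graph with self-loops
    adj = torch.tensor([[1, 1, 0, 1],
                        [1, 1, 1, 0],
                        [0, 1, 1, 1],
                        [1, 0, 1, 1]], dtype=torch.float32)
    pe = laplacian_positional_encoding(adj, k=2)   # (4, 2)
    x = torch.randn(4, 8)
    x = x + nn.Linear(2, 8)(pe)                    # inject positional information
    layer = GraphSelfAttention(dim=8, num_heads=2)
    print(layer(x, adj).shape)                     # torch.Size([4, 8])
```

The repositories listed above typically go further than this sketch: they operate on sparse edge lists (e.g. via DGL or PyTorch Geometric), incorporate edge features, and address the eigenvector sign ambiguity that SignNet and BasisNet target.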