This is the PyTorch implementation of "Graph Distillation with Eigenbasis Matching" (ICML 2024).
deeprobust==0.2.9
gdown==4.7.3
networkx==3.2.1
numpy==1.26.3
ogb==1.3.6
pandas==2.1.4
scikit-learn==1.3.2
scipy==1.11.4
torch==2.1.2
torch_geometric==2.4.0
torch-sparse==0.6.18
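The pinned dependencies above can be installed in one step. A minimal sketch, assuming a pip-based Python 3 environment (the single-command form is our suggestion, not part of the repo):

```shell
# Install the pinned dependencies listed above.
pip install deeprobust==0.2.9 gdown==4.7.3 networkx==3.2.1 numpy==1.26.3 \
    ogb==1.3.6 pandas==2.1.4 scikit-learn==1.3.2 scipy==1.11.4 \
    torch==2.1.2 torch_geometric==2.4.0 torch-sparse==0.6.18
```

Note that torch-sparse typically requires prebuilt wheels matching your exact torch/CUDA version, so you may need to point pip at the PyG wheel index for that package.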
For Citeseer, Pubmed, and Squirrel, the code will download the datasets automatically.
For Reddit, Flickr, and Ogbn-arXiv, we use the datasets provided by GraphSAINT, which are available via the Google Drive links shared by the GraphSAINT team.
For Twitch-Gamer, you can access the dataset via the Twitch-Gamer link.
Download the files and unzip them into the data directory at the root of the repository.
(1) Run preprocess.py to preprocess the dataset and conduct the spectral decomposition.
(2) Initialize node features of the synthetic graph by running feat_init.py.
(3) Distill the synthetic graph by running distill.py.
(4) Evaluate the cross-architecture generalization of the synthetic graph on various GNNs (GCN, SGC, PPNP, ChebyNet, BernNet, GPR-GNN) by running test_other_arcs.py.
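The four steps above can be sketched as a command sequence. The script names come from this repo, but the --dataset flag and the dataset name are illustrative assumptions; check each script's command-line options for the actual interface:

```shell
# Hypothetical end-to-end run on Citeseer; the --dataset flag is assumed for illustration.
python preprocess.py --dataset citeseer        # (1) preprocessing + spectral decomposition
python feat_init.py --dataset citeseer         # (2) initialize synthetic node features
python distill.py --dataset citeseer           # (3) distill the synthetic graph
python test_other_arcs.py --dataset citeseer   # (4) cross-architecture evaluation
```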
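As a rough illustration of what the spectral decomposition in step (1) computes, here is a minimal NumPy/SciPy sketch (not the repo's code): it builds the normalized graph Laplacian and extracts eigenpairs from both ends of the spectrum. The function name and the choice of k are illustrative.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def spectral_decomposition(adj, k):
    """Eigenpairs of the normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.

    Illustrative sketch: returns the k smallest- and k largest-eigenvalue
    pairs, i.e. both ends of the spectrum.
    """
    deg = np.asarray(adj.sum(axis=1)).flatten()
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5          # D^{-1/2}, guarding isolated nodes
    D = sp.diags(d_inv_sqrt)
    L = sp.eye(adj.shape[0]) - D @ adj @ D    # normalized Laplacian
    vals_lo, vecs_lo = eigsh(L, k=k, which="SM")  # low-frequency end
    vals_hi, vecs_hi = eigsh(L, k=k, which="LM")  # high-frequency end
    return (vals_lo, vecs_lo), (vals_hi, vecs_hi)
```

Eigenvalues of the normalized Laplacian lie in [0, 2]; sparse iterative solvers like eigsh make this tractable for the larger datasets, which is presumably why the decomposition is done once in a preprocessing step.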
If you find our work useful, please cite:
@inproceedings{liugraph,
  title={Graph Distillation with Eigenbasis Matching},
  author={Liu, Yang and Bo, Deyu and Shi, Chuan},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024}
}