EEDN (SIGIR'23)
The paper is available in [/paper] or at [ACM SIGIR](https://dl.acm.org/doi/10.1145/3539618.3591678).
Run `python Main.py`
- Configurations are given in Constants.py and Main.py.
- As mentioned in the paper, EEDN requires a shallow and wide architecture; please DO NOT overly restrict the embedding size for comparisons, unless there is not enough GPU memory.
- When you apply EEDN to other datasets, since lambda and delta are sensitive, please tune these two hyperparameters with Optuna for at least 100 trials, which HAS BEEN IMPLEMENTED in the given code in Main.py (line 160).
- If you have any problems, please feel free to contact me at kaysenn@163.com.
- Python 3.7.6
- PyTorch 1.7.1
Three files are required: train.txt (for training), tune.txt (for tuning), and test.txt (for testing).
Each line denotes an interaction: a user visited a POI a certain number of times.
The format, identical for all three files, is [#USER_ID]\t[#POI_ID]\t[#TIMES]\n.
For example,
0 0 1
0 1 3
0 3 2
1 2 1
the user (ID=0) visited the POI (ID=0) once,
the POI (ID=1) 3 times,
and the POI (ID=3) twice;
the user (ID=1) visited the POI (ID=2) once.
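A minimal reader for this format might look as follows; the function name and the returned structure are illustrative, not part of the repository:

```python
import io
from collections import defaultdict

def read_interactions(f):
    """Parse lines of [#USER_ID]\t[#POI_ID]\t[#TIMES] into a
    dict mapping user_id -> {poi_id: times}."""
    interactions = defaultdict(dict)
    for line in f:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        user_id, poi_id, times = map(int, line.split("\t"))
        interactions[user_id][poi_id] = times
    return dict(interactions)

# The example above, tab-separated as the format specifies:
sample = "0\t0\t1\n0\t1\t3\n0\t3\t2\n1\t2\t1\n"
data = read_interactions(io.StringIO(sample))
print(data)  # {0: {0: 1, 1: 3, 3: 2}, 1: {2: 1}}
```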
| Dataset | #Users | #Items | lambda | delta |
|---|---|---|---|---|
| Douban-book | 12,859 | 22,294 | 0.5 | 1 |
| Gowalla | 18,737 | 32,510 | 1.5 | 4 |
| Foursquare | 7,642 | 28,483 | 0.4 | 0.7 |
| Yelp challenge round 7 | 30,887 | 18,995 | 1 | 2.4 |
| Yelp2018 | 31,668 | 38,048 | 1 | 4 |
- SimGCL SIGIR'2022
- NCL WWW'2022
- DirectAU KDD'2022
- STaTRL APIN'2022
- SGL SIGIR'2021
- SEPT KDD'2021
- LightGCN SIGIR'2020
- CPIR IJCAI'2020
- ENMF TOIS'2020
- SAE-NAD CIKM'2018
If this repository helps you, please cite:
@inproceedings{wang2023eedn,
  title={EEDN: Enhanced Encoder-Decoder Network with Local and Global Context Learning for POI Recommendation},
  author={Wang, Xinfeng and Fukumoto, Fumiyo and Cui, Jin and Suzuki, Yoshimi and Li, Jiyi and Yu, Dongjin},
  booktitle={Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  pages={383--392},
  year={2023}
}
Thanks to Coder-Yu, who collected many available baselines and kindly released them.