
NRMS

Unofficial PyTorch implementation of the EMNLP 2019 paper "Neural News Recommendation with Multi-Head Self-Attention".

Changes

  • Use Ranger instead of the Adam optimizer
  • Title encoder: separate branches for RoBERTa and ELECTRA-Small
  • pytorch-lightning (training setup sketched below)
    • TensorBoard support
    • early stopping
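
A minimal sketch of this training setup, assuming a hypothetical LightningModule named NRMSModel and a val_auroc metric (the repo's actual class and metric names may differ):

import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping
from pytorch_lightning.loggers import TensorBoardLogger

model = NRMSModel()  # placeholder name for the LightningModule wrapping NRMS

trainer = pl.Trainer(
    logger=TensorBoardLogger(save_dir="logs", name="nrms"),      # TensorBoard support
    callbacks=[EarlyStopping(monitor="val_auroc", mode="max")],  # early stopping
    max_epochs=50,
)
trainer.fit(model)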

Benchmark

  • Data: posts from Taiwan's PTT forum (Traditional Chinese)
  • A user's comment ("push") on a post is treated as evidence that the user is interested in that post (sampling sketch below)
  • Trained on a single Titan RTX
  • Trained until early stopping triggered
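
A hedged sketch of turning push lists into training samples under that rule; the variable names and the K = 4 negative-sampling choice follow the paper's K+1 candidate setup, not necessarily this repo's code:

import json
import random

K = 4  # assumed number of sampled negatives per positive

with open("articles.json", encoding="utf-8") as f:
    all_ids = [a["id"] for a in json.load(f)]
with open("users_list.json", encoding="utf-8") as f:
    users = json.load(f)

samples = []
for user in users:
    pushed = set(user["push"])                       # articles the user commented on
    pool = [i for i in all_ids if i not in pushed]   # never-pushed articles serve as negatives
    for pos in pushed:
        history = [i for i in pushed if i != pos]    # click history fed to the user encoder
        candidates = [pos] + random.sample(pool, K)  # index 0 is the positive
        samples.append({"history": history, "candidates": candidates, "label": 0})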

Data Description

  • articles.json: each entry holds a word-segmented post title in Traditional Chinese (e.g. '公告' means 'announcement'); an encoding sketch follows the examples
[{'id': 0, 'title': ['[', '公告', '] ', '八卦', '優文', '推薦', '申請']},
 {'id': 1, 'title': ['[', '公告', '] ', '八卦板', '政治文', '規範', '草案', '開始', '討論']},
 {'id': 2, 'title': ['[', '公告', '] ', '三月份', '置底', '閒聊', '文']},
 ...
 ]
  • users_list.json: each entry maps a user to the articles they commented on
[{'user_id': 0, 'push': [1, 2, 3]},  # 'push' is a list of articles.json ids
 {'user_id': 1, 'push': [2, 5, 6]},
 ...
]
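
A small sketch (not from the repo) of mapping the tokenized titles to fixed-length id sequences for the embedding layer; MAX_LEN and the special tokens are assumptions:

import json

import torch

MAX_LEN = 20  # assumed cap on title length

with open("articles.json", encoding="utf-8") as f:
    articles = json.load(f)

# build a token -> id vocabulary, reserving 0 for padding and 1 for unknowns
vocab = {"<pad>": 0, "<unk>": 1}
for art in articles:
    for tok in art["title"]:
        vocab.setdefault(tok, len(vocab))

def encode(title):
    ids = [vocab.get(tok, 1) for tok in title][:MAX_LEN]
    return ids + [0] * (MAX_LEN - len(ids))  # right-pad with <pad>

title_ids = torch.tensor([encode(a["title"]) for a in articles])
print(title_ids.shape)  # (num_articles, MAX_LEN)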

Model

  • original: Word2Vec embeddings pretrained on Wiki-zh (a condensed encoder sketch follows this list)
  • RoBERTa: a roberta-base checkpoint as the title encoder
  • ELECTRA: an electra-small checkpoint as the title encoder
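
For reference, a condensed sketch of the paper's title encoder (word embedding, then multi-head self-attention, then additive attention pooling); the dimensions are illustrative, and the RoBERTa/ELECTRA branches replace the embedding layer with transformer hidden states:

import torch
import torch.nn as nn

class TitleEncoder(nn.Module):
    def __init__(self, vocab_size=50000, embed_dim=300, heads=15, query_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.self_attn = nn.MultiheadAttention(embed_dim, heads, batch_first=True)
        # additive attention: score each position, softmax, weighted sum
        self.proj = nn.Linear(embed_dim, query_dim)
        self.query = nn.Linear(query_dim, 1, bias=False)

    def forward(self, title_ids):                      # (batch, seq_len)
        x = self.embed(title_ids)                      # (batch, seq_len, embed_dim)
        x, _ = self.self_attn(x, x, x)                 # contextualized word vectors
        scores = self.query(torch.tanh(self.proj(x)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * x).sum(dim=1)                # one vector per title

encoder = TitleEncoder()
print(encoder(torch.randint(1, 50000, (2, 20))).shape)  # torch.Size([2, 300])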

Training Time

  • original (Adam): 1 hr
  • original (Ranger): 1 hr 4 min (the one-line optimizer swap is sketched below)
  • RoBERTa (Ranger): 19 hr 46 min
  • ELECTRA-Small (Ranger): 2 hr 19 min
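
The Adam-to-Ranger swap is a single line in the optimizer setup. This sketch assumes a standalone Ranger implementation such as lessw2020/Ranger-Deep-Learning-Optimizer, which exposes from ranger import Ranger; the exact package and learning rate this repo uses may differ:

from ranger import Ranger  # assumed import path; depends on the Ranger package installed

# inside the LightningModule:
def configure_optimizers(self):
    # previously: return torch.optim.Adam(self.parameters(), lr=1e-3)
    return Ranger(self.parameters(), lr=1e-3)  # lr is illustrative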

Scores on the Validation Set

A scikit-learn sketch for computing these metrics follows the three lists below.

AUROC

  • original (Adam): 0.86
  • original (Ranger): 0.89
  • RoBERTa (Ranger): 0.94
  • ELECTRA-Small (Ranger): 0.91

nDCG@5

  • original (Adam): 0.73
  • original (Ranger): 0.79
  • RoBERTa (Ranger): 0.88
  • ELECTRA-Small (Ranger): 0.81

nDCG@10

  • original (Adam): 0.67
  • original (Ranger): 0.72
  • RoBERTa (Ranger): 0.81
  • ELECTRA-Small (Ranger): 0.74
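
These metrics can be reproduced with scikit-learn, sketched here for a single user's candidate list; the repo's own evaluation code may differ (e.g. averaging per user):

import numpy as np
from sklearn.metrics import ndcg_score, roc_auc_score

y_true = np.array([[1, 0, 0, 1, 0]])             # 1 marks articles the user actually pushed
y_score = np.array([[0.9, 0.2, 0.4, 0.7, 0.1]])  # model relevance scores

print("AUROC  :", roc_auc_score(y_true[0], y_score[0]))
print("nDCG@5 :", ndcg_score(y_true, y_score, k=5))
print("nDCG@10:", ndcg_score(y_true, y_score, k=10))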
