Merge pull request #1420 from zhengbw0324/1.1.x
FEA: Add the implementation of knowledge-aware model KGIN and MCCLK.
Sherry-XLL authored Sep 11, 2022
2 parents 70abcdc + 9adac54 commit b7717f8
Showing 13 changed files with 1,323 additions and 2 deletions.
1 change: 1 addition & 0 deletions .github/workflows/python-package.yml
@@ -45,6 +45,7 @@ jobs:
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
conda list
conda install -c conda-forge faiss-cpu
pip install torch-scatter -f https://data.pyg.org/whl/torch-`python -c "import torch;print(torch.__version__)"`.html
pip install setuptools==59.5.0
pip install plotly
# Use "python -m pytest" instead of "pytest" to fix imports
Binary file added docs/source/asset/kgin.png
Binary file added docs/source/asset/mcclk.png
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -11,7 +11,7 @@ Introduction
RecBole is a unified, comprehensive and efficient framework developed based on PyTorch.
It aims to help researchers reproduce and develop recommendation models.

In the latest release, our library includes 81 recommendation algorithms `[Model List]`_, covering four major categories:
In the latest release, our library includes 83 recommendation algorithms `[Model List]`_, covering four major categories:

- General Recommendation
- Sequential Recommendation
83 changes: 83 additions & 0 deletions docs/source/user_guide/model/knowledge/kgin.rst
@@ -0,0 +1,83 @@
KGIN
===========

Introduction
---------------------

`[paper] <https://dl.acm.org/doi/abs/10.1145/3442381.3450133>`_

**Title:** Learning Intents behind Interactions with Knowledge Graph for Recommendation

**Authors:** Xiang Wang, Tinglin Huang, Dingxian Wang, Yancheng Yuan, Zhenguang Liu, Xiangnan He, Tat-Seng Chua

**Abstract:** Knowledge graph (KG) plays an increasingly important role in recommender systems. A recent technical trend is to develop end-to-end models founded on graph neural networks (GNNs). However, existing GNN-based models are coarse-grained in relational modeling, failing to (1) identify user-item relation at a fine-grained level of intents, and (2) exploit relation dependencies to preserve the semantics of long-range connectivity.

In this study, we explore intents behind a user-item interaction by using auxiliary item knowledge, and propose a new model, Knowledge Graph-based Intent Network (KGIN). Technically, we model each intent as an attentive combination of KG relations, encouraging the independence of different intents for better model capability and interpretability. Furthermore, we devise a new information aggregation scheme for GNN, which recursively integrates the relation sequences of long-range connectivity (i.e., relational paths). This scheme allows us to distill useful information about user intents and encode them into the representations of users and items. Experimental results on three benchmark datasets show that, KGIN achieves significant improvements over the state-of-the-art methods like KGAT, KGNN-LS, and CKAN. Further analyses show that KGIN offers interpretable explanations for predictions by identifying influential intents and relational paths.

.. image:: ../../../asset/kgin.png
   :width: 500
   :align: center

Running with RecBole
-------------------------

**Model Hyper-Parameters:**

- ``embedding_size (int)`` : The embedding size of users, items, entities and relations. Defaults to ``64``.
- ``reg_weight (float)`` : The L2 regularization weight. Defaults to ``1e-5``.
- ``node_dropout_rate (float)`` : The node dropout rate in GCN layer. Defaults to ``0.1``.
- ``mess_dropout_rate (float)`` : The message dropout rate in GCN layer. Defaults to ``0.1``.
- ``sim_regularity (float)`` : The intents independence loss weight. Defaults to ``1e-4``.
- ``context_hops (int)`` : The number of context hops in GCN layer. Defaults to ``3``.
- ``n_factors (int)`` : The number of user intents. Defaults to ``4``.
- ``ind (str)`` : The intents independence loss type. Defaults to ``'cosine'``. Range in ``['mi', 'distance', 'cosine']``.
- ``temperature (float)`` : The temperature parameter used in loss calculation. Defaults to ``0.2``.

**A Running Example:**

Write the following code to a python file, such as `run.py`:

.. code:: python

   from recbole.quick_start import run_recbole

   run_recbole(model='KGIN', dataset='ml-100k')

And then:

.. code:: bash

   python run.py

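If you want to override some of the hyper-parameters listed above, ``run_recbole`` also accepts a ``config_dict`` argument; a minimal sketch is given below (the values are only illustrative, not tuned settings):

.. code:: python

   from recbole.quick_start import run_recbole

   # Illustrative overrides of a few KGIN hyper-parameters described above;
   # any parameter not listed here keeps its default value.
   config_dict = {
       'context_hops': 3,
       'n_factors': 4,
       'ind': 'cosine',
       'sim_regularity': 1e-4,
   }
   run_recbole(model='KGIN', dataset='ml-100k', config_dict=config_dict)
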
Tuning Hyper Parameters
-------------------------

If you want to use ``HyperTuning`` to tune the hyper parameters of this model, you can copy the following settings and save them as ``hyper.test``.

.. code:: bash

   learning_rate choice [1e-4,1e-3,5e-3]
   node_dropout_rate choice [0.1,0.3,0.5]
   mess_dropout_rate choice [0.0,0.1]
   context_hops choice [2,3]
   n_factors choice [4,8]
   ind choice ['cosine','distance']

Note that these hyper-parameter ranges are provided for reference only; we cannot guarantee that they are optimal for this model.

Then, with the source code of RecBole (which you can download from GitHub), run ``run_hyper.py`` to tune the hyper parameters:

.. code:: bash

   python run_hyper.py --model=[model_name] --dataset=[dataset_name] --config_files=[config_files_path] --params_file=hyper.test

For more details about Parameter Tuning, refer to :doc:`../../../user_guide/usage/parameter_tuning`.


If you want to change parameters, dataset or evaluation settings, take a look at

- :doc:`../../../user_guide/config_settings`
- :doc:`../../../user_guide/data_intro`
- :doc:`../../../user_guide/train_eval_intro`
- :doc:`../../../user_guide/usage`

83 changes: 83 additions & 0 deletions docs/source/user_guide/model/knowledge/mcclk.rst
@@ -0,0 +1,83 @@
MCCLK
===========

Introduction
---------------------

`[paper] <https://arxiv.org/abs/2204.08807>`_

**Title:** Multi-level Cross-view Contrastive Learning for Knowledge-aware Recommender System

**Authors:** Ding Zou, Wei Wei, Xian-Ling Mao, Ziyang Wang, Minghui Qiu, Feida Zhu, Xin Cao

**Abstract:** Knowledge graph (KG) plays an increasingly important role in recommender systems. Recently, graph neural networks (GNNs) based model has gradually become the theme of knowledge-aware recommendation (KGR). However, there is a natural deficiency for GNN-based KGR models, that is, the sparse supervised signal problem, which may make their actual performance drop to some extent. Inspired by the recent success of contrastive learning in mining supervised signals from data itself, in this paper, we focus on exploring the contrastive learning in KG-aware recommendation and propose a novel multi-level cross-view contrastive learning mechanism, named MCCLK. Different from traditional contrastive learning methods which generate two graph views by uniform data augmentation schemes such as corruption or dropping, we comprehensively consider three different graph views for KG-aware recommendation, including global-level structural view, local-level collaborative and semantic views. Specifically, we consider the user-item graph as a collaborative view, the item-entity graph as a semantic view, and the user-item-entity graph as a structural view. MCCLK hence performs contrastive learning across three views on both local and global levels, mining comprehensive graph feature and structure information in a self-supervised manner. Besides, in semantic view, a k-Nearest-Neighbor (kNN) item-item semantic graph construction module is proposed, to capture the important item-item semantic relation which is usually ignored by previous work. Extensive experiments conducted on three benchmark datasets show the superior performance of our proposed method over the state-of-the-arts.

.. image:: ../../../asset/mcclk.png
   :width: 500
   :align: center

Running with RecBole
-------------------------

**Model Hyper-Parameters:**

- ``embedding_size (int)`` : The embedding size of users, items, entities and relations. Defaults to ``64``.
- ``reg_weight (float)`` : The L2 regularization weight. Defaults to ``1e-5``.
- ``n_hops (int)`` : The number of context hops in GCN layer. Defaults to ``2``.
- ``node_dropout_rate (float)`` : The node dropout rate in GCN layer. Defaults to ``0.1``.
- ``mess_dropout_rate (float)`` : The message dropout rate in GCN layer. Defaults to ``0.1``.
- ``lightgcn_layer (int)`` : The number of LightGCN layer. Defaults to ``2``.
- ``item_agg_layer (int)`` : The number of item aggregation layer. Defaults to ``1``.
- ``alpha (float)`` : The local-level contrastive loss weight. Defaults to ``0.2``.
- ``beta (float)`` : The contrastive loss weight. Defaults to ``0.1``.
- ``k (int)`` : The topk parameter used in building a k-NN item-item semantic graph. Defaults to ``10``.
- ``lambda_coeff (float)`` : The coefficient when updating k-NN item-item semantic graph. Defaults to ``0.5``.
- ``temperature (float)`` : The temperature parameter used in loss calculation. Defaults to ``0.8``.
- ``build_graph_separately (bool)`` : Whether to use a separate GCN to build the item-item graph. Defaults to ``True``.
- ``loss_type (str)`` : The loss type used in this model. Defaults to ``'BPR'``. Range in ``['BPR', 'BCE']``.

**A Running Example:**

Write the following code to a python file, such as `run.py`:

.. code:: python

   from recbole.quick_start import run_recbole

   run_recbole(model='MCCLK', dataset='ml-100k')

And then:

.. code:: bash

   python run.py

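As with KGIN, the hyper-parameters listed above can also be overridden through the ``config_dict`` argument of ``run_recbole``; a minimal sketch with illustrative (not tuned) values:

.. code:: python

   from recbole.quick_start import run_recbole

   # Illustrative overrides of a few MCCLK hyper-parameters described above;
   # any parameter not listed here keeps its default value.
   config_dict = {
       'n_hops': 2,
       'k': 10,
       'temperature': 0.8,
       'build_graph_separately': True,
   }
   run_recbole(model='MCCLK', dataset='ml-100k', config_dict=config_dict)
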
Tuning Hyper Parameters
-------------------------

If you want to use ``HyperTuning`` to tune the hyper parameters of this model, you can copy the following settings and save them as ``hyper.test``.

.. code:: bash

   learning_rate choice [1e-4,1e-3,5e-3]
   node_dropout_rate choice [0.1,0.3,0.5]
   mess_dropout_rate choice [0.0,0.1]
   build_graph_separately choice [True, False]

Note that these hyper-parameter ranges are provided for reference only; we cannot guarantee that they are optimal for this model.

Then, with the source code of RecBole (which you can download from GitHub), run ``run_hyper.py`` to tune the hyper parameters:

.. code:: bash

   python run_hyper.py --model=[model_name] --dataset=[dataset_name] --config_files=[config_files_path] --params_file=hyper.test

For more details about Parameter Tuning, refer to :doc:`../../../user_guide/usage/parameter_tuning`.


If you want to change parameters, dataset or evaluation settings, take a look at

- :doc:`../../../user_guide/config_settings`
- :doc:`../../../user_guide/data_intro`
- :doc:`../../../user_guide/train_eval_intro`
- :doc:`../../../user_guide/usage`
4 changes: 3 additions & 1 deletion docs/source/user_guide/model_intro.rst
@@ -1,6 +1,6 @@
Model Introduction
=====================
We implement 81 recommendation models covering general recommendation, sequential recommendation,
We implement 83 recommendation models covering general recommendation, sequential recommendation,
context-aware recommendation and knowledge-based recommendation. A brief introduction to these models is as follows:


@@ -122,7 +122,9 @@ Knowledge-based recommendation introduces an external knowledge graph to enhance
model/knowledge/cfkg
model/knowledge/ktup
model/knowledge/kgat
model/knowledge/kgin
model/knowledge/ripplenet
model/knowledge/mcclk
model/knowledge/mkr
model/knowledge/kgcn
model/knowledge/kgnnls
2 changes: 2 additions & 0 deletions recbole/model/knowledge_aware_recommender/__init__.py
@@ -2,7 +2,9 @@
from recbole.model.knowledge_aware_recommender.cke import CKE
from recbole.model.knowledge_aware_recommender.kgat import KGAT
from recbole.model.knowledge_aware_recommender.kgcn import KGCN
from recbole.model.knowledge_aware_recommender.kgin import KGIN
from recbole.model.knowledge_aware_recommender.kgnnls import KGNNLS
from recbole.model.knowledge_aware_recommender.ktup import KTUP
from recbole.model.knowledge_aware_recommender.mcclk import MCCLK
from recbole.model.knowledge_aware_recommender.mkr import MKR
from recbole.model.knowledge_aware_recommender.ripplenet import RippleNet
