
Contrastive learning of structure-activity relationship


shenwanxiang/ACANet


Activity Cliff Awareness



Code repository for the activity-cliff-awareness (ACA) loss and the graph-based ACANet model.


About

1) ACALoss

This study proposes the activity-cliff-awareness (ACA) loss for improving molecular activity prediction by deep learning models. The ACA loss enhances both metric learning in the latent space and task learning in the target space during training, making the network aware of the activity-cliff issue. For more details, please refer to the paper titled "Online triplet contrastive learning enables efficient cliff awareness in molecular activity prediction."

**Comparison of models for molecular activity prediction, one without (left) and one with (right) activity cliff awareness (ACA).** The left panel depicts a model without ACA, where an activity-cliff triplet (A, P, N) is hard to learn in the latent space: because A and N are chemically similar, the model learns graph representations in which the A–P distance is far greater than the A–N distance, leading to poor training and prediction results. The right panel shows a model with ACA, which optimizes the latent vectors so that A moves closer to P and further away from N. The model with ACA combines metric learning in the latent space with minimizing the regression error in the target space, whereas the model without ACA optimizes only the regression loss and may not handle activity cliffs effectively.
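
To make the idea concrete, below is a minimal, self-contained PyTorch sketch of an ACA-style objective: an MAE regression term in the target space plus a triplet term over (anchor, positive, negative) triplets mined from pairwise label differences. The function name `toy_aca_loss`, the masking scheme, and the default thresholds are illustrative only and are not the package's implementation; use `clsar.model.loss.ACALoss` in practice.

# Illustrative sketch only, NOT the clsar implementation.
import torch
import torch.nn.functional as F

def toy_aca_loss(labels, predictions, embeddings, alpha=0.1, margin=1.0,
                 cliff_lower=0.2, cliff_upper=1.0):
    # Task term: mean absolute error in the target (activity) space.
    reg_loss = F.l1_loss(predictions, labels)

    # Metric term: for every triplet where the positive has a similar activity
    # (|dy| <= cliff_lower) and the negative a very different one
    # (|dy| >= cliff_upper), pull the anchor toward the positive and push it
    # away from the negative in the latent space.
    dy = (labels.view(-1, 1) - labels.view(1, -1)).abs()   # pairwise label gaps (B, B)
    dz = torch.cdist(embeddings, embeddings, p=2)          # pairwise latent distances (B, B)
    pos_mask = dy <= cliff_lower
    neg_mask = dy >= cliff_upper

    # triplet[a, p, n] = d(a, p) - d(a, n) + margin, hinged at zero
    triplet = (dz.unsqueeze(2) - dz.unsqueeze(1) + margin).clamp(min=0)
    valid = pos_mask.unsqueeze(2) & neg_mask.unsqueeze(1)
    metric_loss = triplet[valid].mean() if valid.any() else embeddings.sum() * 0.0

    # Weighted sum of the task term and the latent-space term.
    return reg_loss + alpha * metric_loss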

2) ACANet

ACANet is a deep learning model built on the proposed ACA loss and a graph neural network. It tunes the hyperparameters of the ACA loss automatically and provides a high-level, scikit-learn-like interface for training and testing.

Model performance with and without AC-awareness

ACA loss vs. MAE loss on the external test set, and the number of mined triplets during training:

More details on usage and performance can be found here.

ACA loss implementation

ACA loss usage

#Pytorch
from clsar.model.loss import ACALoss

# alpha weights the latent-space (metric) term relative to the regression term;
# cliff_lower and cliff_upper are the activity-difference thresholds used to mine triplets;
# p and squared control the latent distance metric.
aca_loss = ACALoss(alpha=0.1, cliff_lower=0.2, cliff_upper=1.0, p=1., squared=False)
loss = aca_loss(labels, predictions, embeddings)  # labels/predictions in target space, embeddings in latent space
loss.backward()
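
The snippet below places the same call inside a complete toy training step. The two-headed module is only a stand-in for a graph neural network that returns both a prediction and a latent embedding per molecule; it is not part of clsar, and the tensor shapes are assumptions.

# Self-contained toy example; the model is a placeholder, not clsar's GNN.
import torch
import torch.nn as nn
from clsar.model.loss import ACALoss

class ToyRegressor(nn.Module):
    def __init__(self, in_dim=16, latent_dim=8):
        super().__init__()
        self.encoder = nn.Linear(in_dim, latent_dim)   # stands in for a GNN encoder
        self.head = nn.Linear(latent_dim, 1)           # regression head

    def forward(self, x):
        z = torch.relu(self.encoder(x))                # latent embeddings
        return self.head(z), z                         # predictions, embeddings

model = ToyRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
aca_loss = ACALoss(alpha=0.1, cliff_lower=0.2, cliff_upper=1.0, p=1., squared=False)

x = torch.randn(32, 16)        # toy molecular features
labels = torch.randn(32, 1)    # toy pIC50-like labels

optimizer.zero_grad()
predictions, embeddings = model(x)
loss = aca_loss(labels, predictions, embeddings)   # same call signature as above
loss.backward()
optimizer.step()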


#Tensorflow
from clsar.model.loss_tf import ACALoss
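
A rough TensorFlow sketch follows. It assumes the TensorFlow `ACALoss` mirrors the PyTorch constructor arguments and call convention `(labels, predictions, embeddings)`; check `clsar.model.loss_tf` for the exact API. The Keras model here is a placeholder for a graph network.

# Sketch only: the ACALoss arguments and call convention are assumed to mirror PyTorch.
import tensorflow as tf
from clsar.model.loss_tf import ACALoss

inputs = tf.keras.Input(shape=(16,))
latent = tf.keras.layers.Dense(8, activation="relu")(inputs)    # placeholder encoder
preds = tf.keras.layers.Dense(1)(latent)                        # regression head
model = tf.keras.Model(inputs, [preds, latent])
optimizer = tf.keras.optimizers.Adam(1e-3)

aca_loss = ACALoss(alpha=0.1, cliff_lower=0.2, cliff_upper=1.0)  # assumed arguments

x = tf.random.normal((32, 16))
labels = tf.random.normal((32, 1))

with tf.GradientTape() as tape:
    predictions, embeddings = model(x, training=True)
    loss = aca_loss(labels, predictions, embeddings)             # assumed call convention
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))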

Installation

pip install clsar

Run ACANet

from clsar import ACANet
# Xs_train: list of SMILES strings of the training set
# y_train_pIC50: pChEMBL (pIC50) labels of the training set

## init ACANet
clf = ACANet(gpuid=0, work_dir='./')

## optimize the loss hyperparameters (cliff_lower, cliff_upper, and alpha) on the training set
dfp = clf.opt_cliff_by_cv(Xs_train, y_train_pIC50, total_epochs=50, n_repeats=3)
dfa = clf.opt_alpha_by_cv(Xs_train, y_train_pIC50, total_epochs=100, n_repeats=3)


## fit the model using 5-fold cross-validation
clf.cv_fit(Xs_train, y_train_pIC50, verbose=1)


## make predictions with the 5 sub-models; the output is the average of their predictions
test_pred_pIC50 = clf.cv_predict(Xs_test)
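
If held-out labels are available, the averaged predictions can be scored with standard regression metrics; the name y_test_pIC50 below is assumed to hold the test-set pChEMBL labels.

## evaluate the cross-validated predictions (y_test_pIC50 is an assumed variable name)
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

rmse = np.sqrt(mean_squared_error(y_test_pIC50, test_pred_pIC50))
mae = mean_absolute_error(y_test_pIC50, test_pred_pIC50)
print(f'RMSE: {rmse:.3f}, MAE: {mae:.3f}')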

Citation

Wan Xiang Shen*, Chao Cui*, Yu Zong Chen, et al. Online triplet contrastive learning enables efficient cliff awareness in molecular activity prediction. 28 June 2023, PREPRINT (Version 1), Research Square. https://doi.org/10.21203/rs.3.rs-2988283/v1