
Molecule Transformers

Molecule Transformers is a collection of recipes for pre-training and fine-tuning molecular transformer language models such as BART and BERT.

Popular repositories

  1. smiles-featurizers (Public)

     Extract molecular SMILES embeddings from language models pre-trained with various objectives and architectures.

     Python · 16 stars · 1 fork

  2. moleculenet-smiles-bert-mixup (Public)

     Training a pre-trained BERT language model on molecular SMILES from the MoleculeNet benchmark, leveraging mixup and enumeration augmentations.

     Python · 3 stars · 1 fork

  3. moleculenet-bert-ssl (Public)

     Semi-supervised learning techniques (pseudo-labeling, MixMatch, and co-training) for a pre-trained BERT language model in the low-data regime, using molecular SMILES from the MoleculeNet benchmark.

     Python · 2 stars

  4. smiles-augment (Public)

     Augment molecular SMILES with methods such as enumeration and mixup for low-data settings in downstream supervised drug discovery tasks.

     Python · 2 stars · 1 fork

  5. moleculetransformers.github.io (Public)

     Documentation for Molecule Transformers.

     2 stars

  6. rdkit-benchmarking-platform-transformers (Public)

     Port of the RDKit Benchmarking Platform to pre-trained transformer-based language models for the virtual screening drug discovery task.

     Python
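Several of the repositories above (moleculenet-smiles-bert-mixup, smiles-augment) apply mixup augmentation to SMILES-derived features. As a rough illustrative sketch, not code taken from these repositories, mixup forms a convex combination of two examples' feature vectors and labels, with the mixing weight drawn from a Beta distribution:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Mix two examples: a convex combination of features and labels.

    The weight lam is drawn from Beta(alpha, alpha), as in the original
    mixup formulation. Function and parameter names here are hypothetical.
    """
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y

# Hypothetical usage: mix two SMILES embedding vectors and their binary labels.
x_a, y_a = np.ones(4), 1.0
x_b, y_b = np.zeros(4), 0.0
x_mix, y_mix = mixup(x_a, y_a, x_b, y_b)
```

With these toy inputs, every entry of `x_mix` equals the sampled weight `lam`, and so does `y_mix`; in practice the inputs would be embeddings of two SMILES strings (possibly after enumeration) and their task labels.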

Repositories

Showing 6 of 6 repositories

  • smiles-featurizers (Public) · Python · 16 stars · 1 fork · Apache-2.0 license · Updated Nov 9, 2023
  • moleculetransformers.github.io (Public) · 2 stars · Apache-2.0 license · Updated Jul 23, 2023
  • moleculenet-bert-ssl (Public) · Python · 2 stars · Apache-2.0 license · Updated Mar 17, 2023
  • rdkit-benchmarking-platform-transformers (Public) · Python · Updated Dec 29, 2022
  • smiles-augment (Public) · Python · 2 stars · 1 fork · Apache-2.0 license · Updated Dec 28, 2022
  • moleculenet-smiles-bert-mixup (Public) · Python · 3 stars · 1 fork · Apache-2.0 license · Updated Dec 23, 2022
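The moleculenet-bert-ssl repository lists pseudo-labeling among its semi-supervised techniques. As a minimal sketch of the general idea (not the repository's actual implementation), pseudo-labeling keeps only those unlabeled examples on which the current model is confident, then treats the model's predictions as labels for further training:

```python
import numpy as np

def pseudo_label(model_predict, x_unlabeled, threshold=0.9):
    """Select unlabeled examples whose top predicted class probability
    meets `threshold`, and return them with their predicted (pseudo) labels.

    `model_predict` is any callable mapping inputs to an (n, n_classes)
    array of class probabilities; all names here are hypothetical.
    """
    probs = model_predict(x_unlabeled)      # shape (n, n_classes)
    confidence = probs.max(axis=1)
    keep = confidence >= threshold
    return x_unlabeled[keep], probs[keep].argmax(axis=1)

# Toy stand-in for a classifier: fixed per-example class probabilities.
probs = np.array([[0.95, 0.05],   # confident: class 0
                  [0.60, 0.40],   # below threshold: dropped
                  [0.10, 0.90]])  # confident: class 1
x_u = np.arange(3.0).reshape(3, 1)
x_sel, y_sel = pseudo_label(lambda x: probs, x_u, threshold=0.9)
# x_sel keeps examples 0 and 2; y_sel is [0, 1]
```

In a full SSL loop, the selected pairs would be appended to the labeled set and the model retrained, repeating until no new confident examples appear.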
