julianmack/knowledge-distill

Knowledge Distillation

For current results and training commands, see the Results Notebook.
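As background, the distillation objective is typically a temperature-softened cross-entropy against the teacher's output blended with the usual hard-label loss. The following is a minimal pure-Python sketch of that idea; the temperature `T`, weight `alpha`, and function names are illustrative and not taken from this repository's training code:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label, T=2.0, alpha=0.5):
    """Blend of soft-target cross-entropy (vs. teacher) and hard-label cross-entropy.

    The soft term is scaled by T*T so its gradient magnitude stays comparable
    across temperatures (as in Hinton et al.'s formulation).
    """
    p_teacher = softmax(teacher_logits, T)
    p_student_soft = softmax(student_logits, T)
    soft_loss = -sum(pt * math.log(ps)
                     for pt, ps in zip(p_teacher, p_student_soft)) * T * T
    # Hard-label term uses the unscaled (T=1) student distribution.
    p_student = softmax(student_logits)
    hard_loss = -math.log(p_student[hard_label])
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

A student that agrees with both the teacher and the gold label gets a lower loss than one that contradicts them.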

Setup and Installation

  1. Install deps with miniconda:

    conda env create -f environment.yml
    conda activate distill
    pip install -e .
  2. Download GloVe embeddings. The GloVe directory can be in any location, but the default is ./model.

  3. Download (or train) a teacher fast-bert model and place its contents in ./model.

  4. Place CSV data for training/evaluation in the ./data directory.
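After steps 2–4, the default layout the commands above assume can be sketched as follows (only the ./model and ./data directory names come from the steps; what ends up inside them depends on your GloVe and teacher-model downloads):

```shell
# Create the default directories the setup steps above assume
mkdir -p model data
# model/ holds the GloVe embeddings and the teacher model files
# data/  holds the training/evaluation CSVs
ls -d model data   # sanity check that both directories exist
```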

About

Knowledge-distillation of CamemBERT
