riybha216/SCCUR2021-TokenizerEval

About

Code for the SCCUR 2021 oral talk: Measuring the Efficiency of Pre-Trained English Tokenizers for the Optimization of NLP Models.
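The repository's own scripts are not shown on this page. As an illustration only, the following minimal Python sketch (assuming the Hugging Face `transformers` library and illustrative checkpoint names such as `bert-base-uncased`, `gpt2`, and `roberta-base`, which are not taken from this repository) shows the kind of comparison the talk title describes: counting the subword tokens each pre-trained tokenizer produces for the same English text.

```python
# Minimal sketch (not the repository's actual code): compare how many subword
# tokens different pre-trained English tokenizers produce for the same input.
from transformers import AutoTokenizer

SAMPLE = "Measuring the efficiency of pre-trained English tokenizers."

# Checkpoint names are illustrative; any Hugging Face model with a tokenizer works.
for name in ["bert-base-uncased", "gpt2", "roberta-base"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    tokens = tokenizer.tokenize(SAMPLE)
    print(f"{name}: {len(tokens)} tokens -> {tokens}")
```

Token counts on a fixed text are one simple proxy for tokenizer efficiency: fewer tokens for the same input mean shorter sequences fed to a downstream NLP model.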

Resources

License

Releases

No releases published

Packages

No packages published

Contributors 4
