Materials presented at the CU Machine Learning Reading Group, March 2019, organized by Debashis Ghosh, PhD.
My presentation is based on the following paper:
Radford A, Wu J, Child R, Luan D, Amodei D, Sutskever I. Language Models are Unsupervised Multitask Learners. Available at: https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf
The accompanying OpenAI blog post: https://blog.openai.com/better-language-models/
The GPT-2 GitHub repository: https://github.com/openai/gpt-2
This work is an extension of OpenAI's previous work, which I highly recommend reading as well:
Radford A, Narasimhan K, Salimans T, Sutskever I. Improving Language Understanding by Generative Pre-Training. Available at: https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf