Pre-trained Language Models and Selected Applications

QuASE: Question-Answer Driven Sentence Encoding

TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data

Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

Toward Better Storylines with Sentence-Level Language Models

tBERT: Topic Models and BERT Joining Forces for Semantic Similarity Detection

FastBERT: a Self-distilling BERT with Adaptive Inference Time

Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models

DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering

Enhancing Pre-trained Chinese Character Representation with Word-aligned Attention

Span Selection Pre-training for Question Answering

DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference

MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices

Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning

Few-Shot NLG with Pre-Trained Language Model