Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus and leaderboard
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs and configurations are available to ensure reproducibility and benchmarking.
A serverless architecture for orchestrating ETL jobs in arbitrarily complex workflows using AWS Step Functions and AWS Lambda.
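As a hedged illustration of the pattern (not this repository's code), each Step Functions task state typically invokes a Lambda handler that performs one ETL step and returns the input for the next state; the field names below are hypothetical:

```python
def handler(event, context):
    """Hypothetical Lambda handler for one ETL step in a Step Functions
    workflow: the state machine passes `event` in, and the returned dict
    becomes the input of the next state."""
    records = event.get("records", [])  # extracted by an upstream state
    transformed = [
        {**r, "amount_usd": round(float(r["amount"]) / 100, 2)}  # transform step
        for r in records
    ]
    return {"records": transformed, "count": len(transformed)}
```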
Pretrain and finetune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
⛵️The official PyTorch implementation for "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
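To make the idea concrete, here is a minimal PyTorch sketch of progressive module replacing; the class and parameter names (`TheseusEncoder`, `replace_rate`) are illustrative assumptions, not the repository's actual API:

```python
import torch
import torch.nn as nn

class TheseusEncoder(nn.Module):
    """Sketch of progressive module replacing: each predecessor module
    (a group of frozen teacher layers) is stochastically swapped for one
    compact, trainable successor layer during training; after convergence,
    only the successor layers are kept."""

    def __init__(self, predecessor_modules, successor_layers, replace_rate=0.5):
        super().__init__()
        self.predecessors = nn.ModuleList(predecessor_modules)  # frozen
        self.successors = nn.ModuleList(successor_layers)       # trainable
        self.replace_rate = replace_rate

    def forward(self, hidden_states):
        for pred, succ in zip(self.predecessors, self.successors):
            if self.training and torch.rand(()).item() < self.replace_rate:
                hidden_states = succ(hidden_states)  # successor substitutes
            else:
                hidden_states = pred(hidden_states)
        return hidden_states
```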
AWS tutorial code.
ALBERT model pretraining and fine-tuning using TF 2.0
PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression
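The objective behind patient distillation can be sketched as follows; this is an assumed, simplified PyTorch rendering of the loss (task cross-entropy + temperature-scaled KL on logits + a "patient" MSE over matched intermediate representations), not the repository's code:

```python
import torch.nn.functional as F

def pkd_loss(student_logits, teacher_logits, labels,
             student_hidden, teacher_hidden,
             alpha=0.5, beta=10.0, temperature=2.0):
    """Sketch of the Patient Knowledge Distillation objective."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # student_hidden / teacher_hidden: lists of matched intermediate
    # [CLS] vectors from the layers chosen for supervision.
    patient = sum(
        F.mse_loss(F.normalize(s, dim=-1), F.normalize(t, dim=-1))
        for s, t in zip(student_hidden, teacher_hidden)
    ) / len(student_hidden)
    return (1 - alpha) * ce + alpha * kd + beta * patient
```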
Implementation of XLNet that can load pretrained checkpoints
Datasets collection and preprocessing framework for NLP extreme multitask learning
Build and deploy a serverless data pipeline on AWS with no effort.
A generalized framework for subspace tuning methods in parameter efficient fine-tuning.
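LoRA is one representative subspace-tuning method; here is a minimal, hypothetical PyTorch sketch of the idea (a frozen base layer plus a trainable rank-r update), not this framework's actual API:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Sketch of a low-rank 'subspace' adapter: the frozen base weight W
    is augmented with a trainable rank-r update B @ A, so fine-tuning
    happens in an r-dimensional subspace of the weight space."""

    def __init__(self, base: nn.Linear, r=8, alpha=16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pre-trained weights
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # update starts at zero
        self.scaling = alpha / r

    def forward(self, x):
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scaling
```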
IndoLEM is a comprehensive Indonesian NLU benchmark, comprising three pillars of NLP tasks: morpho-syntax, semantics, and discourse. Presented at COLING 2020.
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
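ELECTRA's replaced-token-detection objective can be sketched in a few lines of PyTorch; the function and argument names below are illustrative assumptions, not the paper's reference code:

```python
import torch.nn.functional as F

def replaced_token_detection_loss(disc_logits, input_ids, original_ids,
                                  attention_mask):
    """Sketch of ELECTRA's discriminator objective: for every input
    position, predict whether the token was replaced by the generator
    (a binary decision over all tokens, not just the masked subset)."""
    is_replaced = (input_ids != original_ids).float()  # per-token labels
    loss = F.binary_cross_entropy_with_logits(
        disc_logits.squeeze(-1), is_replaced, reduction="none"
    )
    mask = attention_mask.float()
    return (loss * mask).sum() / mask.sum()  # average over non-padding tokens
```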
MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
GLUE benchmark code based on bert4keras
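For readers browsing this topic, GLUE tasks are commonly loaded via the Hugging Face datasets library; a minimal sketch, using MRPC purely as an example task:

```python
from datasets import load_dataset

# Load one GLUE task (here MRPC, a paraphrase-detection dataset).
mrpc = load_dataset("glue", "mrpc")
print(mrpc["train"][0])        # {'sentence1': ..., 'sentence2': ..., 'label': ...}
print(mrpc["train"].features)  # column types and label names
```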