A treasure chest for visual classification and recognition powered by PaddlePaddle
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
This is a collection of our NAS and Vision Transformer work.
PyTorch implementation of various knowledge distillation (KD) methods.
OpenMMLab Model Compression Toolbox and Benchmark.
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
A curated list for Efficient Large Language Models
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
The official implementation of Decoupled Knowledge Distillation (CVPR 2022, https://arxiv.org/abs/2203.08679) and DOT: A Distillation-Oriented Trainer (ICCV 2023, https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf)
A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization.
An extensible (general) continual learning framework based on PyTorch; the official codebase of Dark Experience for General Continual Learning.
Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 oral)
Segmind's distilled diffusion models
Code and resources on scalable and efficient Graph Neural Networks
[ICCV 2023] MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices