Awesome Knowledge Distillation
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021).
PyTorch implementations of various Knowledge Distillation (KD) methods (a minimal sketch of the classic KD loss appears after this list).
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
A beginner's introductory tutorial on model compression.
[ECCV 2022] Factorizing Knowledge in Neural Networks
Training ImageNet/CIFAR models with state-of-the-art strategies and techniques such as ViT, KD, Rep, etc.
Awesome-3D/Multimodal-Anomaly-Detection-and-Localization/Segmentation/3D-KD/3D-knowledge-distillation
Official implementation of the paper "Masked Distillation with Receptive Tokens" (ICLR 2023).
Matching Guided Distillation (ECCV 2020)
Rotated Localization Distillation (CVPR 2022, TPAMI 2023)
A simple script to convert Agilent 845x Chemstation UV-Vis files (.KD or .SD formats) to .csv format. Fast and easy!
Pluto notebook for curve fitting
Documentation for Ki Libraries and Languages
This is a Kotlin implementation of the KD language. It is feature-complete and passes all tests.
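Several of the repositories above implement logit-based knowledge distillation. For reference, here is a minimal sketch of the classic distillation loss (Hinton et al., 2015) in PyTorch; the function name `kd_loss` and the defaults `T=4.0` and `alpha=0.9` are illustrative assumptions, not values taken from any repository in this list.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Classic logit distillation: blend a temperature-softened KL term
    # against the teacher with the standard hard-label cross-entropy.
    # T and alpha are illustrative defaults (assumptions), not values
    # from any repository in this list.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps soft-target gradients on a comparable scale
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage: a batch of 8 samples with 10 classes
student = torch.randn(8, 10)            # student logits
teacher = torch.randn(8, 10).detach()   # teacher logits (no gradient)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student, teacher, labels)
```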