STACKQUADRANT

Tebmer/Awesome-Knowledge-Distillation-of-LLMs

LLM Frameworks

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.

Overall Score: 3.9
GitHub Metrics
Stars: 1.3k
Forks: 73
Open Issues: 2
Watchers: 17
Contributors: 2
Weekly Commits: 0
Language: —
License: None
Last Commit: Mar 9, 2025
Created: Feb 8, 2024
Latest Release: None (no releases published)
Synced: Apr 16, 2026
Quality Scores
Documentation Quality (weight: 20%): 4.4

No dedicated docs site. Description: 226 chars. Stars signal: 1,273. Contributors: 2. Score: 4.4/10

Community Health (weight: 20%): 4.2

Stars: 1,273. Contributors: 2. Watchers: 17. Forks: 73. Issue ratio: 0.2%. Score: 4.2/10

Maintenance Velocity (weight: 15%): 1.0

Last commit: 403 days ago. Weekly commits: 0. No releases published. Maturity bonus: 2.2 years old. Score: 1.0/10

API Design & DXw: 20%
6.1

Stars/issues ratio: 637. No dedicated API docs. No license specified. Popularity signal: 1,273 stars. Score: 6.1/10

Production Readiness (weight: 15%): 3.1

Battle-tested: 1,273 stars. Peer review: 2 contributors. No versioned releases. No license (risky for production). Age: 2.2 years. Maintenance: last commit 403 days ago. Score: 3.1/10

Ecosystem Integration (weight: 10%): 3.9

Fork interest: 73. No license (integration risk). Adoption: 1,273 stars. Score: 3.9/10
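
The six weights above (20% + 20% + 15% + 20% + 15% + 10%) sum to 100%, and the headline Overall Score of 3.9 is consistent with a plain weighted average of the six component scores. Below is a minimal sketch of that aggregation; the scores and weights are taken from this page, while the exact formula and rounding StackQuadrant applies are assumptions.

```python
# Sketch of the overall-score aggregation, assuming a simple weighted
# average. Component scores and weights come from the card above; the
# rounding rule is an assumption.

scores = {
    "Documentation Quality": (4.4, 0.20),
    "Community Health":      (4.2, 0.20),
    "Maintenance Velocity":  (1.0, 0.15),
    "API Design & DX":       (6.1, 0.20),
    "Production Readiness":  (3.1, 0.15),
    "Ecosystem Integration": (3.9, 0.10),
}

# Weights sum to 1.0, so no normalization step is needed.
overall = sum(score * weight for score, weight in scores.values())
print(round(overall, 1))  # 3.945 -> 3.9, matching the headline score
```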

Tags
alignment, compression, data-augmentation, data-synthesis, feedback, instruction-following, kd, knowledge-distillation, large-language-model, llm
Radar: chart plotting the six quality dimensions above (Documentation Quality, Community Health, Maintenance Velocity, API Design & DX, Production Readiness, Ecosystem Integration).