Tebmer/Awesome-Knowledge-Distillation-of-LLMs
LLM Frameworks

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". It breaks KD down into Knowledge Elicitation and Distillation Algorithms, and explores the Skill & Vertical Distillation of LLMs.
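For orientation, the "Distillation Algorithms" the survey categorizes build on the classic soft-label objective: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term. A minimal, dependency-free sketch of that loss (function names and the `temperature=2.0` default are illustrative, not from the repository):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

When teacher and student logits agree, the loss is zero; any mismatch yields a positive penalty. LLM-specific algorithms surveyed here extend this idea (e.g., to sequence-level and black-box teacher settings), but the term-matching principle is the same.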
No dedicated docs site. Description: 226 chars. Stars signal: 1,273. Contributors: 2. Score: 4.4/10
Stars: 1,273. Contributors: 2. Watchers: 17. Forks: 73. Issue ratio: 0.2%. Score: 4.2/10
Last commit: 403 days ago. Weekly commits: 0. No releases published. Maturity bonus for 2.2 years of age. Score: 1/10
Stars/issues ratio: 637. No dedicated API docs. No license specified. Popularity signal: 1,273 stars. Score: 6.1/10
Battle-tested: 1,273 stars. Peer review: 2 contributors. No versioned releases. No license (risky for production use). Age: 2.2 years. Maintenance: last commit 403 days ago. Score: 3.1/10
Fork interest: 73. No license (integration risk). Adoption: 1,273 stars. Score: 3.9/10