Distilling Wisdom: A Review on Optimizing Learning From Massive Language Models

In the era of Large Language Models (LLMs), Knowledge Distillation (KD) enables the transfer of capabilities from proprietary LLMs to open-source models. This survey provides a detailed discussion of the basic principles, algorithms, and implementation methods of knowledge...


Bibliographic Details
Main Author: Dingzong Zhang (23275066)
Other Authors: Devi Listiyani (23275069), Priyanka Singh (256412), Manoranjan Mohanty (23275072)
Published: 2025