PyTorch implementation of various Knowledge Distillation (KD) methods.
Topics: knowledge-distillation, teacher-student, knowledge-transfer, model-compression, distillation, kd, kd-methods
Updated Nov 25, 2021 · Python
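As a point of reference for what such KD methods compute, below is a minimal NumPy sketch of the classic soft-target distillation loss of Hinton et al. (2015): the KL divergence between temperature-softened teacher and student output distributions, scaled by T². This is an illustrative sketch of the general technique, not code from this repository; the function names and the default temperature are assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T**2 so gradients keep a comparable scale across T.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # soft predictions from the student
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1)
    return float(kl.mean() * T ** 2)
```

When the student's logits match the teacher's exactly, the loss is zero; any mismatch yields a positive value. In practice this term is combined with the ordinary cross-entropy on hard labels via a weighting coefficient.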