Research on compressing BERT with low-rank factorization and knowledge distillation
Topics: svd, bert, tucker-decomposition, low-rank-factorizations, teacher-model, teacher-prediction, student-model
Updated Nov 18, 2022 - Python
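The low-rank factorization the repository's title refers to can be illustrated with a minimal NumPy sketch: a dense BERT-size weight matrix is approximated by the product of two thin factors via truncated SVD. The function name `factorize_weight` and the rank of 64 are illustrative assumptions, not taken from the repository itself.

```python
import numpy as np

def factorize_weight(W, rank):
    """Approximate W (d_out x d_in) by the product of two low-rank
    factors using truncated SVD. Hypothetical helper for illustration."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # (d_out, rank), singular values folded in
    B = Vt[:rank, :]             # (rank, d_in)
    return A, B

# A 768x768 weight (BERT-base hidden size) replaced by two rank-64 factors:
# parameter count drops from 589,824 to 2 * 768 * 64 = 98,304.
W = np.random.randn(768, 768)
A, B = factorize_weight(W, rank=64)
```

In a compressed model, the original linear layer `x @ W.T` would be replaced by two smaller layers computing `x @ B.T @ A.T`, which is where the parameter and FLOP savings come from.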
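The teacher-model / student-model topics refer to knowledge distillation, where a small student is trained to match a large teacher's softened predictions. A minimal sketch of the standard Hinton-style soft-target loss follows; the function names and the temperature value are assumptions for illustration, not code from the repository.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 as is conventional. Hypothetical helper."""
    p = softmax(teacher_logits / T)   # teacher prediction (soft targets)
    q = softmax(student_logits / T)   # student prediction
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

In practice this term is usually combined with the ordinary cross-entropy on the ground-truth labels, weighted by a mixing coefficient.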