Speech Model Compression Papers

This repo collects recent papers on speech model compression. Please feel free to suggest other papers!

2023

  • [INTERSPEECH] [arXiv] Task-Agnostic Structured Pruning of Speech Representation Models
  • [INTERSPEECH] [arXiv] [code] DPHuBERT: Joint Distillation and Pruning of Self-Supervised Speech Models
  • [ICASSP] [arXiv] Learning ASR pathways: A sparse multilingual ASR model
  • [ICASSP] [arXiv] I3D: Transformer Architectures with Input-Dependent Dynamic Depth for Speech Recognition
  • [ICASSP] [arXiv] Structured Pruning of Self-Supervised Pre-Trained Models for Speech Recognition and Understanding
  • [ICASSP] [arXiv] RobustDistiller: Compressing Universal Speech Representations for Enhanced Environment Robustness
  • [ICASSP] [arXiv] Ensemble Knowledge Distillation of Self-Supervised Speech Models

2022

  • [SLT] [arXiv] Learning a Dual-Mode Speech Recognition Model via Self-Pruning
  • [INTERSPEECH] [arXiv] [code] LightHuBERT: Lightweight and Configurable Speech Representation Learning with Once-for-All Hidden-Unit BERT
  • [INTERSPEECH] [arXiv] Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models
  • [INTERSPEECH] [arXiv] [code] FitHuBERT: Going Thinner and Deeper for Knowledge Distillation of Speech Self-Supervised Learning
  • [ICASSP] [arXiv] [code] DistilHuBERT: Speech Representation Learning by Layer-wise Distillation of Hidden-unit BERT

2021

  • [NeurIPS] [arXiv] PARP: Prune, Adjust and Re-Prune for Self-Supervised Speech Recognition
