Add adapter support for Hubert #551
Open
+362 −30
This PR adds adapter support for Hubert by Meta. I have a few questions:
- Should I also add support for LoRA? It expects `PretrainedConfig` in `LoRALinear`, which is not imported in `modeling_hubert.py`. Should I add that import and then support LoRA too? Update: I have now added support for LoRA as well; let me know if it looks fine. (A rough sketch of the wiring is below this list.)
- What is `adjust_tensors_for_parallel` used for? I see you have used it while calculating attention. (The usage pattern I copied is in the second sketch below.)
- In `src/transformers/adapters/models/hubert/adapter_model.py` there are many heads, and for this ASR model I found only `ClassificationHead`, `ModelWithFlexibleHeadsAdaptersMixin`, `MultiLabelClassificationHead`, and `MultipleChoiceHead` useful. Let me know if that's fine; otherwise I can add heads for other tasks too. (See the head sketch below.)
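For reference, here is roughly how I wired up LoRA in `modeling_hubert.py`, following the pattern the other models in this repo use. The `LoRALinear` import path and signature are copied from those implementations, so treat this as a sketch rather than the final diff:

```python
import torch.nn as nn

from transformers import PretrainedConfig
# Imports added for LoRA support; PretrainedConfig was not previously
# imported in modeling_hubert.py, which is what prompted my question.
from transformers.adapters.lora import Linear as LoRALinear


class HubertAttention(nn.Module):
    """Multi-headed attention with LoRA-enabled projections (abridged sketch)."""

    def __init__(self, embed_dim: int, num_heads: int, bias: bool = True,
                 config: PretrainedConfig = None):
        super().__init__()
        self.embed_dim = embed_dim
        self.num_heads = num_heads
        # LoRALinear wraps nn.Linear and injects the low-rank update whenever
        # a LoRA adapter is active; it takes the model config plus a location
        # key and attention key, mirroring the other model implementations.
        self.k_proj = LoRALinear(embed_dim, embed_dim, "selfattn", config, attn_key="k", bias=bias)
        self.v_proj = LoRALinear(embed_dim, embed_dim, "selfattn", config, attn_key="v", bias=bias)
        self.q_proj = LoRALinear(embed_dim, embed_dim, "selfattn", config, attn_key="q", bias=bias)
        self.out_proj = nn.Linear(embed_dim, embed_dim, bias=bias)
```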
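On the second question, this is the pattern I copied from the attention code of other models in this repo. My current understanding, which I would be happy to have confirmed, is that `adjust_tensors_for_parallel` repeats tensors along the batch dimension when a `Parallel` adapter composition has duplicated the batch:

```python
import torch
from torch import nn

from transformers.adapters.composition import adjust_tensors_for_parallel


def attention_scores(q_proj: nn.Module, k_proj: nn.Module, v_proj: nn.Module,
                     hidden_states: torch.Tensor, attention_mask=None):
    """Abridged attention forward showing where the batch fix-up happens."""
    query_states = q_proj(hidden_states)
    key_states = k_proj(hidden_states)
    value_states = v_proj(hidden_states)
    # With a Parallel adapter composition active, the adapter-wrapped
    # projections can grow the batch dimension (one copy per parallel
    # branch). Repeat the remaining tensors so every batch dimension
    # matches the query states again before mixing them together.
    key_states, value_states, attention_mask = adjust_tensors_for_parallel(
        query_states, key_states, value_states, attention_mask
    )
    scores = torch.matmul(query_states, key_states.transpose(-1, -2))
    if attention_mask is not None:
        scores = scores + attention_mask
    return scores, value_states
```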
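And for the third point, the head setup in `adapter_model.py` currently looks like this (abridged; the signatures follow the other `adapter_model.py` files, so the details are approximate):

```python
from transformers.adapters.heads import (
    ClassificationHead,
    ModelWithFlexibleHeadsAdaptersMixin,
    MultiLabelClassificationHead,
    MultipleChoiceHead,
)
from transformers.models.hubert.modeling_hubert import (
    HubertModel,
    HubertPreTrainedModel,
)


class HubertAdapterModel(ModelWithFlexibleHeadsAdaptersMixin, HubertPreTrainedModel):
    """Hubert with flexible prediction heads instead of one fixed task head."""

    head_types = ["classification", "multilabel_classification", "multiple_choice"]

    def __init__(self, config):
        super().__init__(config)
        self.hubert = HubertModel(config)
        # Set up the flexible-heads machinery from the mixin.
        self._init_head_modules()
        self.init_weights()

    def add_classification_head(self, head_name, num_labels=2, layers=2,
                                activation_function="tanh", overwrite_ok=False,
                                multilabel=False, id2label=None):
        # Single- and multi-label classification share one entry point,
        # as in the other adapter models.
        if multilabel:
            head = MultiLabelClassificationHead(
                self, head_name, num_labels, layers, activation_function, id2label
            )
        else:
            head = ClassificationHead(
                self, head_name, num_labels, layers, activation_function, id2label
            )
        self.add_prediction_head(head, overwrite_ok=overwrite_ok)

    def add_multiple_choice_head(self, head_name, num_choices=2, layers=2,
                                 overwrite_ok=False, id2label=None):
        head = MultipleChoiceHead(self, head_name, num_choices, layers,
                                  id2label=id2label)
        self.add_prediction_head(head, overwrite_ok=overwrite_ok)
```

With this, a downstream user could do e.g. `model.add_classification_head("ks", num_labels=12)` for an audio classification task (the head name and label count here are just illustrative).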
I'm working on tests and will finalise after the first review. Thanks a lot!