
Debertav2 support? #13

Open
comchobo opened this issue May 24, 2023 · 2 comments

Comments

@comchobo

Hello. I'm trying to use your method for sentence embedding. I think DeBERTaV3 is a really strong BERT-style model and tried to implement it with your code, but it turned out to be a bit tough for me. Do you have any solution or plan to support DeBERTaV3 in your code, e.g., implementing a DebertaForCL class in models.py?

@YJiangcm
Owner

Hi, I think you may try changing 'Roberta' to 'DebertaV2' in models.py.

e.g., change

```python
from transformers.models.roberta.modeling_roberta import RobertaPreTrainedModel, RobertaModel, RobertaLMHead
```

to

```python
from transformers.models.deberta_v2.modeling_deberta_v2 import DebertaV2PreTrainedModel, DebertaV2Model, DebertaV2LMHead
```

and create a DebertaForCL class.

Hope it works.
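To make the suggestion concrete, here is a hypothetical sketch of what a "ForCL" (contrastive-learning) wrapper looks like: pool the encoder's [CLS] hidden state, project it, and compute an in-batch InfoNCE loss. A tiny stand-in encoder is used so the sketch runs without downloading weights; in models.py you would swap it for DebertaV2Model and subclass DebertaV2PreTrainedModel instead of nn.Module. All class and parameter names here are illustrative, not the repo's actual API.

```python
# Hypothetical sketch of the DebertaForCL idea: pool encoder outputs and
# compute an in-batch InfoNCE contrastive loss (SimCSE-style objective).
import torch
import torch.nn as nn
import torch.nn.functional as F

class StandInEncoder(nn.Module):
    """Placeholder for DebertaV2Model: maps token ids to hidden states."""
    def __init__(self, vocab_size=100, hidden_size=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)

    def forward(self, input_ids):
        return self.embed(input_ids)  # (batch, seq_len, hidden)

class ForCL(nn.Module):
    """Contrastive head: [CLS] pooling + projection + InfoNCE loss."""
    def __init__(self, encoder, hidden_size=32, temp=0.05):
        super().__init__()
        self.encoder = encoder
        self.proj = nn.Linear(hidden_size, hidden_size)
        self.temp = temp

    def forward(self, input_ids_a, input_ids_b):
        # Encode the two views of each sentence and pool the first token.
        z_a = self.proj(self.encoder(input_ids_a)[:, 0])
        z_b = self.proj(self.encoder(input_ids_b)[:, 0])
        # In-batch InfoNCE: matching pairs on the diagonal are positives.
        sim = F.cosine_similarity(z_a.unsqueeze(1), z_b.unsqueeze(0), dim=-1)
        labels = torch.arange(sim.size(0))
        return F.cross_entropy(sim / self.temp, labels)

model = ForCL(StandInEncoder())
loss = model(torch.randint(0, 100, (4, 8)), torch.randint(0, 100, (4, 8)))
```

With a real DebertaV2Model the forward call would also pass attention_mask and pool from outputs.last_hidden_state, but the loss computation stays the same.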

@comchobo
Author

I think that, because of the attention mechanism in DeBERTaV2, it is not easy to implement a prefix-based model for it. It is also not supported in peft, which implements most lightweight training methods, including prefix-based ones.
