
Is full-parameter fine-tuning supported? #180

Open
WuChannn opened this issue Aug 15, 2024 · 2 comments
Comments

@WuChannn

The fine-tuning interface released so far only does LoRA fine-tuning. Given sufficient training resources, is full-parameter fine-tuning possible? Would it be enough to comment out `model = get_peft_model(model, peft_config)` in peft_lora.py?
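A minimal sketch of the change being asked about, assuming peft_lora.py wraps the model roughly like this (the function name and LoRA hyperparameters below are illustrative, not the repo's actual code):

```python
from peft import LoraConfig, get_peft_model

def prepare_model(model, use_lora: bool = True):
    """Either wrap the model with LoRA adapters or leave it unwrapped for full fine-tuning."""
    if use_lora:
        # LoRA path: only the injected low-rank adapter weights are trainable.
        peft_config = LoraConfig(
            r=8,
            lora_alpha=32,
            target_modules=["q_proj", "v_proj"],  # assumption: attention projection layers
            lora_dropout=0.05,
        )
        model = get_peft_model(model, peft_config)
        model.print_trainable_parameters()
    else:
        # Full-parameter path: skip get_peft_model entirely and make sure
        # no parameters were frozen elsewhere in the script.
        for p in model.parameters():
            p.requires_grad_(True)
    return model
```

Note that simply commenting out the `get_peft_model` line is usually only enough if nothing else in the script freezes parameters or assumes a `PeftModel` (for example, code that saves adapter weights only); the optimizer setup and checkpoint saving may also need adjusting.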

zRzRzRzRzRzRzR self-assigned this Aug 17, 2024
@zRzRzRzRzRzRzR
Member

The current environment can't handle full-parameter fine-tuning; with this setup there isn't enough GPU memory for it.
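For context, a rough way to estimate the memory full-parameter fine-tuning would need (weights, gradients, and AdamW optimizer states; the ~16 bytes/parameter figure assumes mixed precision with fp32 optimizer states and excludes activation memory, which depends on batch size and sequence length):

```python
import torch

def estimate_full_ft_memory_gb(model: torch.nn.Module) -> float:
    """Back-of-the-envelope GPU memory estimate for full fine-tuning with AdamW.

    Assumes bf16/fp16 weights (2 B) and gradients (2 B) plus fp32 master
    weights and two fp32 AdamW moment buffers (12 B), i.e. ~16 bytes/param.
    Activation memory is workload-dependent and not included.
    """
    n_params = sum(p.numel() for p in model.parameters())
    return n_params * 16 / 1024 ** 3
```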

@WuChannn
Author

My resources should be sufficient, so would it be enough to comment out `model = get_peft_model(model, peft_config)` in peft_lora.py?
