
how to pretrain llama2? #1418

Closed
wen020 opened this issue May 14, 2024 · 4 comments

Comments

@wen020

wen020 commented May 14, 2024

We read the pretraining code in this repo and found that it uses the GPT model. How do we pretrain Llama 2?

@fireyanci

You can read the tutorials.

@rasbt
Collaborator

rasbt commented May 20, 2024

Actually, the GPT here is just a Python class. All other LLMs are architecturally very similar to GPT, which is why they are derived from that class in the code. To choose the LLM you want to pretrain, see the pretraining tutorial here: https://github.com/Lightning-AI/litgpt/blob/main/tutorials/pretrain.md

E.g., `litgpt pretrain ... --model_name Llama-2-7b-hf ...`
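
For reference, here is a minimal sketch of what a full command might look like, based on the pretraining tutorial linked above. Only `--model_name` comes from the example in this comment; the `--data`, `--tokenizer_dir`, and `--out_dir` flags and all paths below are assumptions for illustration, so check `litgpt pretrain --help` in your installed version for the exact options.

```bash
# Hypothetical invocation: the flag names and paths (other than --model_name)
# are assumptions based on the pretrain tutorial, not a verified command.
litgpt pretrain \
  --model_name Llama-2-7b-hf \
  --data TextFiles \
  --data.train_data_path path/to/your/training/text \
  --tokenizer_dir checkpoints/meta-llama/Llama-2-7b-hf \
  --out_dir out/pretrain/llama-2-7b
```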

@wen020
Author

wen020 commented May 21, 2024

ok

@rasbt
Collaborator

rasbt commented May 21, 2024

I hope this helped. Please feel free to reopen if you have a follow-up question.

@rasbt rasbt closed this as completed May 21, 2024