Help!! How do I set the epoch count for ChatGlm-v2-6b_Lora?? #160

Open
fengzehui0422 opened this issue Nov 15, 2023 · 1 comment
Comments

@fengzehui0422

No description provided.

@fengzehui0422
Author

export CUDA_VISIBLE_DEVICES=0
python main.py \
    --do_train \
    --train_file D:/LLM/yuanzhoulvpi/AdvertiseGen/train.json \
    --validation_file D:/LLM/yuanzhoulvpi/AdvertiseGen/dev.json \
    --preprocessing_num_workers 10 \
    --prompt_column content \
    --response_column summary \
    --overwrite_cache \
    --model_name_or_path chatglm2-6b_model \
    --output_dir output/adgen-chatglm2-6b-lora_version \
    --overwrite_output_dir \
    --max_source_length 64 \
    --max_target_length 128 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 16 \
    --predict_with_generate \
    --max_steps 3000 \
    --logging_steps 10 \
    --save_steps 100 \
    --learning_rate 2e-5 \
    --lora_r 32 \
    --model_parallel_mode True
I don't see an epoch count anywhere in these arguments. Does that mean LoRA can only be trained for a single pass?
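
For reference: this is not specific to LoRA. Assuming main.py parses the standard HuggingFace Seq2SeqTrainingArguments via HfArgumentParser (as the ChatGLM fine-tuning scripts it is based on do), training length is controlled by either --max_steps or --num_train_epochs, and a positive --max_steps takes precedence. The command above therefore trains for exactly 3000 optimizer steps, and no epoch count appears because one is never needed. A sketch of the change, keeping all other arguments as-is (the value 3 below is a placeholder, not a recommendation):

export CUDA_VISIBLE_DEVICES=0
python main.py \
    --do_train \
    --train_file D:/LLM/yuanzhoulvpi/AdvertiseGen/train.json \
    --validation_file D:/LLM/yuanzhoulvpi/AdvertiseGen/dev.json \
    ...same middle arguments as above... \
    --num_train_epochs 3 \
    --logging_steps 10 \
    --save_steps 100 \
    --learning_rate 2e-5 \
    --lora_r 32 \
    --model_parallel_mode True
# Note: --max_steps is removed here. In the HuggingFace Trainer,
# max_steps > 0 overrides num_train_epochs, so leaving both set
# would silently ignore the epoch setting.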
