Training works fine, but after training I cannot merge the LoRA weights with the following command:
CUDA_VISIBLE_DEVICES=0 swift export \
    --ckpt_dir '/root/hao/ocr/result/id_internvl_0106_v18/v2-20250106-164114/checkpoint-1500' --merge_lora true
The error is as follows:
From v4.50 onwards, PreTrainedModel will NOT inherit from GenerationMixin, and this model will lose the ability to call generate and other related functions.
If you are the owner of the model architecture code, please modify your model class such that it inherits from GenerationMixin (after PreTrainedModel, otherwise you'll get an exception).
If you are not the owner of the model architecture class, please contact the model code owner to update it.
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:05<00:00, 1.36s/it]
[INFO:swift] default_system: 你是由上海人工智能实验室联合商汤科技开发的书生多模态大模型,英文名叫InternVL, 是一个有用无害的人工智能助手。
[INFO:swift] Merge LoRA...
[INFO:swift] Saving merged weights...
[2025-01-07 15:25:24,141] [INFO] [real_accelerator.py:222:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/swift/cli/export.py", line 5, in <module>
export_main()
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/swift/llm/export/export.py", line 41, in export_main
return SwiftExport(args).main()
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/swift/llm/base.py", line 45, in main
result = self.run()
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/swift/llm/export/export.py", line 24, in run
merge_lora(args)
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/swift/llm/export/merge_lora.py", line 42, in merge_lora
save_checkpoint(
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/swift/llm/utils.py", line 206, in save_checkpoint
model.save_pretrained(output_dir, safe_serialization=safe_serialization, max_shard_size=max_shard_size)
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2815, in save_pretrained
misplaced_generation_parameters = model_to_save.config._get_non_default_generation_parameters()
File "/root/miniforge3/envs/hao_swift/lib/python3.10/site-packages/transformers/configuration_utils.py", line 1063, in _get_non_default_generation_parameters
default_config = self.__class__()
File "/root/.cache/huggingface/modules/transformers_modules/internvl2_8b/configuration_internvl_chat.py", line 50, in __init__
if llm_config.get('architectures')[0] == 'LlamaForCausalLM':
TypeError: 'NoneType' object is not subscriptable
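For context, the crash happens because `save_pretrained` re-instantiates the config class with no arguments (`self.__class__()`), and in that case `llm_config` ends up empty, so `llm_config.get('architectures')` returns `None` and indexing `[0]` raises. A minimal sketch of a defensive guard for the failing check, assuming a hypothetical helper rather than the actual upstream patch:

```python
# Hypothetical defensive rewrite of the check at line 50 of
# configuration_internvl_chat.py. When the config is rebuilt with default
# (empty) arguments, llm_config may be None or lack 'architectures',
# so guard before subscripting instead of calling [0] unconditionally.

def llm_backbone_label(llm_config):
    """Classify the LLM backbone named in llm_config.

    Tolerates llm_config being None or missing the 'architectures' key,
    which is exactly the state produced by a no-argument re-instantiation.
    """
    architectures = (llm_config or {}).get('architectures') or []
    if architectures and architectures[0] == 'LlamaForCausalLM':
        return 'llama'
    return 'default'


# No longer raises TypeError on an empty default config:
print(llm_backbone_label(None))
print(llm_backbone_label({'architectures': ['LlamaForCausalLM']}))
```

The same `(llm_config or {}).get(...) or []` pattern applied in place inside the model's `__init__` would let the default-config round trip in `_get_non_default_generation_parameters()` succeed instead of aborting the merge.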