
Inference works, but deployment fails after merging: adapter_config.json not found #3643


Closed
wookpeckerjohn opened this issue Mar 25, 2025 · 1 comment


wookpeckerjohn commented Mar 25, 2025

```
ERROR:asyncio:Exception in callback functools.partial(<function VllmEngine.patch_remove_log.<locals>.new_log_task_completion at 0x7fd8601d8d30>, error_callback=<bound method AsyncLLMEngine._error_callback of <vllm.engine.async_llm_engine.AsyncLLMEngine object at 0x7fd88c093040>>)
handle: <Handle functools.partial(<function VllmEngine.patch_remove_log.<locals>.new_log_task_completion at 0x7fd8601d8d30>, error_callback=<bound method AsyncLLMEngine._error_callback of <vllm.engine.async_llm_engine.AsyncLLMEngine object at 0x7fd88c093040>>)>
Traceback (most recent call last):
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/lora/worker_manager.py", line 101, in _load_adapter
    peft_helper = PEFTHelper.from_local_dir(
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/lora/peft_helper.py", line 96, in from_local_dir
    with open(lora_config_path) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/octopus/work/category/Train/ms-swift/Output/v10-20250321-154400/checkpoint-6708-merged/adapter_config.json'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "uvloop/cbhandles.pyx", line 63, in uvloop.loop.Handle._run
  File "/home/powerop/work/conda/envs/train/lib/python3.10/site-packages/swift/llm/infer/infer_engine/vllm_engine.py", line 481, in new_log_task_completion
    return_value = task.result()
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 825, in run_engine_loop
    result = task.result()
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 748, in engine_step
    request_outputs = await self.engine.step_async(virtual_engine)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/engine/async_llm_engine.py", line 353, in step_async
    outputs = await self.model_executor.execute_model_async(
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/executor/executor_base.py", line 250, in execute_model_async
    output = await make_async(self.execute_model)(execute_model_req)
  File "/home/powerop/work/conda/envs/train/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/executor/executor_base.py", line 139, in execute_model
    output = self.collective_rpc("execute_model",
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/executor/uniproc_executor.py", line 56, in collective_rpc
    answer = run_method(self.driver_worker, method, args, kwargs)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/utils.py", line 2196, in run_method
    return func(*args, **kwargs)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/worker/worker_base.py", line 420, in execute_model
    output = self.model_runner.execute_model(
  File "/home/powerop/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/worker/model_runner.py", line 1661, in execute_model
    self.set_active_loras(model_input.lora_requests,
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/worker/model_runner.py", line 1363, in set_active_loras
    self.lora_manager.set_active_adapters(lora_requests, lora_mapping)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/lora/worker_manager.py", line 165, in set_active_adapters
    set_active_adapters_worker(requests, mapping, self._apply_adapters,
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/adapter_commons/utils.py", line 54, in set_active_adapters_worker
    apply_adapters_func(requests)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/lora/worker_manager.py", line 225, in _apply_adapters
    self.add_adapter(lora)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/lora/worker_manager.py", line 233, in add_adapter
    lora = self._load_adapter(lora_request)
  File "/home/powerop/.local/lib/python3.10/site-packages/vllm/lora/worker_manager.py", line 134, in _load_adapter
    raise ValueError(
ValueError: Loading lora _lora failed: No adapter found for /home/octopus/work/category/Train/ms-swift/Output/v10-20250321-154400/checkpoint-6708-merged
```
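
For context, the two chained tracebacks describe one failure: vLLM's LoRA path (`_load_adapter` → `PEFTHelper.from_local_dir`) expects an adapter checkpoint, i.e. a directory containing adapter_config.json plus the adapter weights. A checkpoint produced by merging the LoRA into the base model (e.g. via `swift export --merge_lora true`) is a full standalone model and no longer contains that file, which is exactly what the FileNotFoundError above reports. A rough sketch of the two layouts (file listings are illustrative, not taken from the report):

```shell
$ ls checkpoint-6708             # LoRA adapter checkpoint: what --adapters expects
adapter_config.json  adapter_model.safetensors  ...

$ ls checkpoint-6708-merged      # merged checkpoint: a full model, no adapter files
config.json  model.safetensors  tokenizer_config.json  ...
```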

@Jintao-Huang (Collaborator) commented:

Use `--model` instead of `--adapters`.
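
Concretely, a minimal sketch of the fix, assuming the server was launched with ms-swift's `swift deploy` and the vLLM backend (the checkpoint path comes from the traceback; the other flags are illustrative):

```shell
# Fails: the merged checkpoint is passed as a LoRA adapter, so vLLM
# looks for adapter_config.json inside it and raises FileNotFoundError.
swift deploy \
    --adapters /home/octopus/work/category/Train/ms-swift/Output/v10-20250321-154400/checkpoint-6708-merged \
    --infer_backend vllm

# Works: a merged checkpoint is a complete standalone model,
# so point --model at it instead.
swift deploy \
    --model /home/octopus/work/category/Train/ms-swift/Output/v10-20250321-154400/checkpoint-6708-merged \
    --infer_backend vllm
```

If you want to serve the unmerged LoRA instead, keep `--adapters` but point it at the original adapter checkpoint (which still contains adapter_config.json), not at the `-merged` directory.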
