
Cannot use the OpenAI SDK with the dashscope model server #493

Open
3 tasks done
wn1652400018 opened this issue Jun 15, 2024 · 13 comments
Assignees
Labels
bug Something isn't working

Comments

@wn1652400018

Initial Checks

  • I have searched GitHub for a duplicate issue and I'm sure this is something new
  • I have read and followed the docs & demos and still think this is a bug
  • I am confident that the issue is with modelscope-agent (not my code, or another library in the ecosystem)

What happened + What you expected to happen

`sh scripts/run_assistant_server.sh --model-server dashscope` starts up successfully.
Everything works when I call the service with curl, but calling it through the OpenAI SDK against the dashscope model server fails immediately.
The error occurs when running the following code:
    from openai import OpenAI
    import os

    os.environ['DASHSCOPE_API_KEY'] = '<my dashscope api key>'
    api_base = "http://localhost:31512/v1/"
    model = 'Qwen2-72B-Instruct'

    tools = [{
        "type": "function",
        "function": {
            "name": "amap_weather",
            "description": "amap weather tool",
            "parameters": [{
                "name": "location",
                "type": "string",
                "description": "城市/区具体名称,如北京市海淀区请描述为海淀区",
                "required": True
            }]
        }
    }]

    tool_choice = 'auto'

    client = OpenAI(
        base_url=api_base,
        api_key="empty",
    )
    chat_completion = client.chat.completions.create(
        messages=[{
            "role": "user",
            "content": "海淀区天气是什么?"
        }],
        model=model,
        tools=tools,
        tool_choice=tool_choice
    )
Error output:
python /root/modelscope-agent/temp.py
Traceback (most recent call last):
File "/root/modelscope-agent/temp.py", line 29, in <module>
chat_completion = client.chat.completions.create(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
return self._post(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
return self._request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1005, in _request
return self._retry_request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1053, in _retry_request
return self._request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1005, in _request
return self._retry_request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1053, in _retry_request
return self._request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Internal Server Error

The server-side error is as follows:
2024-06-15 17:49:22.000 - modelscope-agent - INFO - | message: call dashscope generation api | uuid: | details: {'model': 'Qwen2-72B-Instruct', 'messages': [{'role': 'user', 'content': 'What is the weather like in Boston?'}], 'stop': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location.', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}], 'top_p': 0.8, 'result_format': 'message', 'stream': True} | step: | error:
<generator object Generation.call.. at 0x7fd2cca4bd80>
2024-06-15 17:49:22.018 - modelscope-agent - INFO - | message: call dashscope generation api | uuid: | details: {'model': 'Qwen2-72B-Instruct', 'messages': [{'role': 'system', 'content': '\n# 工具\n\n## 你拥有如下工具:\n\namap_weather: amap_weather API. amap weather tool Parameters: [{"name": "location", "type": "string", "description": "城市/区具体名称,如北京市海淀区请描述为海淀区", "required": true}] Format the arguments as a JSON object.\n\n## 当你需要调用工具时,请在你的回复中穿插如下的工具调用命令,可以根据需求调用零次或多次:\n\n工具调用\nAction: 工具的名称,必须是[amap_weather]之一\nAction Input: 工具的输入\nObservation: 工具返回的结果\nAnswer: 根据Observation总结本次工具调用返回的结果,如果结果中出现url,请使用如下格式展示出来:图片\n\n\n# 指令\n\nNone\n\n请注意:你具有图像和视频的展示能力,也具有运行代码的能力,不要在回复中说你做不到。\n'}, {'role': 'user', 'content': '(。你可以使用工具:[amap_weather])海淀区天气是什么?'}], 'stop': ['Observation:', 'Observation:\n'], 'top_p': 0.8, 'result_format': 'message', 'stream': True} | step: | error:
<generator object Generation.call.. at 0x7fd2cca4bd80>
2024-06-15 17:49:22.019 - modelscope-agent - INFO - | message: call llm 1 times output: <generator object stream_output at 0x7fd2cca4bbc0>
INFO: 127.0.0.1:33044 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/root/modelscope-agent/modelscope_agent/llm/dashscope.py", line 136, in stat_last_call_token_info
'prompt_tokens': response.usage.input_tokens,
AttributeError: 'generator' object has no attribute 'usage'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in call
return await self.app(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/applications.py", line 123, in call
await self.middleware_stack(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in call
raise exc
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in call
await self.app(scope, receive, _send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 756, in call
await self.middleware_stack(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
await route.handle(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
await self.app(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
response = await func(request)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
raw_response = await run_endpoint_function(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
return await dependant.call(**values)
File "/root/modelscope-agent/modelscope_agent_servers/assistant_server/api.py", line 168, in chat_completion
for chunk in result:
File "/root/modelscope-agent/modelscope_agent/agents/role_play.py", line 289, in _run
for s in output:
File "/root/modelscope-agent/modelscope_agent/llm/dashscope.py", line 20, in stream_output
for trunk in response:
File "/root/modelscope-agent/modelscope_agent/llm/dashscope.py", line 144, in stat_last_call_token_info
if not chunk.usage.get('total_tokens'):
AttributeError: 'NoneType' object has no attribute 'get'
Exception ignored in: <generator object HttpRequest._handle_request at 0x7fd2cca4be60>
Traceback (most recent call last):
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line -1, in run_asgi
RuntimeError: generator ignored GeneratorExit
(The same request is retried twice more, at 17:49:22.928 and 17:49:24.608, producing identical log entries and tracebacks.)
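The two stacked AttributeErrors point at the streaming path: with `stream=True`, DashScope's `Generation.call` returns a generator, so `response.usage` does not exist, and individual chunks can carry `usage=None` until the final chunk arrives, which is what trips `chunk.usage.get('total_tokens')`. A minimal defensive sketch of that check (the helper name below is hypothetical; the real code lives in `stat_last_call_token_info` in `modelscope_agent/llm/dashscope.py`):

```python
# Hedged sketch: guard token-usage bookkeeping against streaming chunks
# whose `usage` field is None. `safe_token_usage` is a hypothetical
# helper for illustration, not the project's actual fix.
def safe_token_usage(chunk):
    """Return a (prompt_tokens, total_tokens) pair, or (0, 0) when the
    chunk carries no usage info (common for intermediate stream chunks)."""
    usage = getattr(chunk, 'usage', None)  # a bare generator has no .usage at all
    if not usage:                          # None or missing -> nothing to count yet
        return 0, 0
    return usage.get('input_tokens', 0), usage.get('total_tokens', 0)
```

With a guard of this shape, intermediate chunks simply contribute nothing instead of raising, and only the final chunk (which carries real usage numbers) updates the counters.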

Versions / Dependencies

Cloned the latest version from git.

Reproduction script

See above.

Issue Severity

None

@wn1652400018 wn1652400018 added the bug Something isn't working label Jun 15, 2024
@mushenL
Collaborator

mushenL commented Jun 17, 2024

@wn1652400018 Hello, you need to replace the "empty" part of the code below with your dashscope api key:
client = OpenAI(base_url=api_base, api_key="empty")

@wn1652400018
Author

wn1652400018 commented Jun 17, 2024 via email

@BaLuoBooo

I ran into the same situation: after the service starts up, the model files are never loaded.

@lisenjie757

I hit the same problem. How can it be solved?

@BaLuoBooo

I now serve the model with the official vLLM, then call it through get_chat_model from qwen_agent.llm in Qwen-Agent; there are usage examples in Qwen-Agent.

@lisenjie757

> I now serve the model with the official vLLM, then call it through get_chat_model from qwen_agent.llm in Qwen-Agent; there are usage examples in Qwen-Agent.

What is the difference between qwen-agent and modelscope-agent? Can they only run Qwen, or can they also run other open-source LLMs?

@BaLuoBooo

> I now serve the model with the official vLLM, then call it through get_chat_model from qwen_agent.llm in Qwen-Agent; there are usage examples in Qwen-Agent.
>
> What is the difference between qwen-agent and modelscope-agent? Can they only run Qwen, or can they also run other open-source LLMs?

I've only tried Qwen; I haven't tried the others.

@zzhangpurdue
Collaborator

> I now serve the model with the official vLLM, then call it through get_chat_model from qwen_agent.llm in Qwen-Agent; there are usage examples in Qwen-Agent.
>
> What is the difference between qwen-agent and modelscope-agent? Can they only run Qwen, or can they also run other open-source LLMs?

modelscope-agent supports a wide range of models; qwen-agent is built around Qwen's capabilities.

@zzhangpurdue
Collaborator

> I ran into the same situation: after the service starts up, the model files are never loaded.

Hello, you need to replace the "empty" part of the code below with your dashscope api key:
client = OpenAI(base_url=api_base, api_key="empty")

@zzhangpurdue
Collaborator

> I hit the same problem. How can it be solved?

Hello, you need to replace the "empty" part of the code below with your dashscope api key:
client = OpenAI(base_url=api_base, api_key="empty")

@zzhangpurdue
Collaborator

zzhangpurdue commented Jul 10, 2024

Please replace "empty" in the code below with your dashscope api key:
client = OpenAI(base_url=api_base, api_key="empty")

Also make sure the model name is set to a dashscope model name such as 'qwen-max', and that modelscope-agent >= 0.6.2.
A working example:

Run sh scripts/run_assistant_server.sh --model-server dashscope to start a tool-calling service on port 31512.

Then use the following code to call it with the OpenAI SDK:

    from openai import OpenAI
    import os

    api_base = "http://localhost:31512/v1/"
    model = 'qwen-max'

    tools = [{
        "type": "function",
        "function": {
            "name": "amap_weather",
            "description": "amap weather tool",
            "parameters": [{
                "name": "location",
                "type": "string",
                "description": "城市/区具体名称,如北京市海淀区请描述为海淀区",
                "required": True
            }]
        }
    }]

    tool_choice = 'auto'

    client = OpenAI(
        base_url=api_base,
        api_key="########",
    )
    chat_completion = client.chat.completions.create(
        messages=[{
            "role": "user",
            "content": "海淀区天气是什么?"
        }],
        model=model,
        tools=tools,
        tool_choice=tool_choice
    )
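Once the call succeeds, the tool invocation comes back on the standard OpenAI chat-completions response shape. A small sketch of pulling it out of a response body (the sample payload below is illustrative, not captured from this server, and whether the assistant server populates tool_calls exactly this way is an assumption):

```python
import json

def first_tool_call(resp):
    """Return (tool_name, parsed_arguments) from the first choice,
    or None when the model answered without calling a tool."""
    msg = resp["choices"][0]["message"]
    calls = msg.get("tool_calls") or []
    if not calls:
        return None
    fn = calls[0]["function"]
    # `arguments` is a JSON-encoded string in the OpenAI schema
    return fn["name"], json.loads(fn["arguments"])

# Illustrative payload only -- not an actual response from the server.
sample = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": "",
            "tool_calls": [{
                "type": "function",
                "function": {
                    "name": "amap_weather",
                    "arguments": "{\"location\": \"Haidian\"}"
                }
            }]
        }
    }]
}

print(first_tool_call(sample))  # -> ('amap_weather', {'location': 'Haidian'})
```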

@lisenjie757

> I hit the same problem. How can it be solved?
>
> Hello, you need to replace the "empty" part of the code below with your dashscope api key: client = OpenAI(base_url=api_base, api_key="empty")

@zzhangpurdue What is a dashscope api key, what is it for, and how do I get one? I just want to use modelscope-agent to run an open-source model locally with function calling, using weights I have already downloaded. I followed this doc: https://github.com/modelscope/modelscope-agent/blob/master/docs/llms/qwen2_tool_calling.md

I ran the following command:

sh scripts/run_assistant_server.sh --served-model-name Qwen2-7B-Instruct --model path/to/weights

After starting the service, I ran the OpenAI SDK demo from the doc:

from openai import OpenAI
api_base = "http://localhost:31512/v1/"
model = 'Qwen2-7B-Instruct'

tools = [{
    "type": "function",
    "function": {
        "name": "amap_weather",
        "description": "amap weather tool",
        "parameters": [{
            "name": "location",
            "type": "string",
            "description": "城市/区具体名称,如`北京市海淀区`请描述为`海淀区`",
            "required": True
        }]
    }
}]

tool_choice = 'auto'

client = OpenAI(
    base_url=api_base,
    api_key="empty",
)
chat_completion = client.chat.completions.create(
    messages=[{
        "role": "user",
        "content": "海淀区天气是什么?"
    }],
    model=model,
    tools=tools,
    tool_choice=tool_choice
)

But I got the same error as the OP:

scripts/run_assistant_server.sh: 16: [[: not found
Model name: 
Model directory: 
Model server: 
Running fastapi assistant server at port 31512.
Running FastAPI assistant server at port 31512 as default.
Starting install nltk data...
nltk data installed.
setting nltk data path to: /mnt/nas/lsj/workspace/modelscope-agent/tmp/nltk_data
/opt/conda/lib/python3.10/site-packages/pydantic/_internal/_fields.py:184: UserWarning: Field name "function_map" shadows an attribute in parent "Agent"; 
  warnings.warn(
/opt/conda/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_server" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
INFO:     Started server process [254643]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:31512 (Press CTRL+C to quit)
2024-07-10 16:42:29.239 - modelscope-agent - INFO -  | message: call dashscope generation api | uuid:  | details: {'model': 'Qwen2-7B-Instruct', 'messages': [{'role': 'user', 'content': 'What is the weather like in Boston?'}], 'stop': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location.', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}], 'top_p': 0.8, 'result_format': 'message', 'stream': True} | step:  | error: 
2024-07-10 16:42:29.340 - modelscope-agent - INFO -  | message: call dashscope generation api | uuid:  | details: {'model': 'Qwen2-7B-Instruct', 'messages': [{'role': 'system', 'content': '\n# 工具\n\n## 你拥有如下工具:\n\namap_weather: amap_weather API. amap weather tool Parameters: [{"name": "location", "type": "string", "description": "城市/区具体名称,如`北京市海淀区`请描述为`海淀区`", "required": true}] Format the arguments as a JSON object.\n\n## 当你需要调用工具时,请在你的回复中穿插如下的工具调用命令,可以根据需求调用零次或多次:\n\n工具调用\nAction: 工具的名称,必须是[amap_weather]之一\nAction Input: 工具的输入\nObservation: <result>工具返回的结果</result>\nAnswer: 根据Observation总结本次工具调用返回的结果,如果结果中出现url,请使用如下格式展示出来:![图片](url)\n\n\n# 指令\n\nNone\n\n请注意:你具有图像和视频的展示能力,也具有运行代码的能力,不要在回复中说你做不到。\n'}, {'role': 'user', 'content': '(。你可以使用工具:[amap_weather])海淀区天气是什么?'}], 'stop': ['Observation:', 'Observation:\n'], 'top_p': 0.8, 'result_format': 'message', 'stream': True} | step:  | error: 
2024-07-10 16:42:29.340 - modelscope-agent - INFO -  | message: call llm 1 times output: <generator object stream_output at 0x7fb414a8c200>
INFO:     127.0.0.1:39754 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/llm/dashscope.py", line 134, in stat_last_call_token_info
    'prompt_tokens': response.usage.input_tokens,
AttributeError: 'generator' object has no attribute 'usage'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 419, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/conda/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/opt/conda/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/conda/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 762, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 782, in app
    await route.handle(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/conda/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/opt/conda/lib/python3.10/site-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/opt/conda/lib/python3.10/site-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
  File "/opt/conda/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent_servers/assistant_server/api.py", line 168, in chat_completion
    for chunk in result:
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/agents/role_play.py", line 287, in _run
    for s in output:
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/llm/dashscope.py", line 20, in stream_output
    for trunk in response:
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/llm/dashscope.py", line 142, in stat_last_call_token_info
    if not chunk.usage.get('total_tokens'):
AttributeError: 'NoneType' object has no attribute 'get'
(The same request is retried at 16:42:30, producing an identical log entry and traceback.)
    await self.middleware_stack(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 782, in app
    await route.handle(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/opt/conda/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/conda/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/opt/conda/lib/python3.10/site-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/opt/conda/lib/python3.10/site-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
  File "/opt/conda/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent_servers/assistant_server/api.py", line 168, in chat_completion
    for chunk in result:
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/agents/role_play.py", line 287, in _run
    for s in output:
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/llm/dashscope.py", line 20, in stream_output
    for trunk in response:
  File "/mnt/nas/lsj/workspace/modelscope-agent/modelscope_agent/llm/dashscope.py", line 142, in stat_last_call_token_info
    if not chunk.usage.get('total_tokens'):
AttributeError: 'NoneType' object has no attribute 'get'
(The same log lines and traceback repeat for each subsequent request.)

@zzhangpurdue (Collaborator) commented Jul 10, 2024
You need to replace `--model path/to/weights` with the local path to your model weights:
sh scripts/run_assistant_server.sh --served-model-name Qwen2-7B-Instruct --model path/to/weights
When no local model path is found, the server falls back to dashscope by default; dashscope is Alibaba's cloud LLM API.
