Cannot use the OpenAI SDK with the dashscope model server #493
Comments
@wn1652400018 Hi, you need to replace the "empty" part of the code below with your dashscope api key.
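(The snippet being referred to, as quoted later in the thread; `api_base` is the local assistant-server URL taken from the reproduction code below:)

```python
from openai import OpenAI

# Local assistant server endpoint, from the reproduction code below
api_base = "http://localhost:31512/v1/"

client = OpenAI(
    base_url=api_base,
    api_key="empty",  # <-- replace "empty" with your dashscope api key
)
```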
As far as I recall I already tried that, and it still didn't work. I'll give it another try shortly.
I ran into the same thing: after the service comes up, the model files are never loaded.
Hit the same problem here. How can it be resolved?
What is the difference between qwen-agent and modelscope-agent? Can the two only run Qwen, or can they also run other open-source LLMs?
I've only tried Qwen; I haven't tested other models yet.
modelscope-agent supports all kinds of models; qwen-agent is built around Qwen's capabilities.
Hi, you need to replace the "empty" part of the code with your dashscope api key (see the snippet above).
Please replace the "empty" in the code with your dashscope api key and make sure the model name is set to a dashscope model name such as 'qwen-max'. Run it, then call it with the OpenAI SDK using code like the sketch below.
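(A minimal sketch reassembled from the fragments of that comment and the reproduction code further down; reading the key from the `DASHSCOPE_API_KEY` environment variable is an assumption:)

```python
import os
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:31512/v1/",    # local assistant server
    api_key=os.environ["DASHSCOPE_API_KEY"],  # dashscope api key instead of "empty"
)

# Same tool definition as in the original report below
tools = [{
    "type": "function",
    "function": {
        "name": "amap_weather",
        "description": "amap weather tool",
        "parameters": [{
            "name": "location",
            "type": "string",
            "description": "城市/区具体名称,如`北京市海淀区`请描述为`海淀区`",
            "required": True,
        }]
    }
}]

chat_completion = client.chat.completions.create(
    model="qwen-max",  # a dashscope model name, per the comment above
    messages=[{"role": "user", "content": "海淀区天气是什么?"}],
    tools=tools,
    tool_choice="auto",
)
print(chat_completion)
```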
@zzhangpurdue What is a dashscope api key, what is it for, and how do I get one? I just want to use modelscope-agent to run an open-source model locally, with weights I have already downloaded, so that it has function-call capability. I followed this doc https://github.com/modelscope/modelscope-agent/blob/master/docs/llms/qwen2_tool_calling.md and ran the launch command it gives.
After the service started, I ran the openai sdk demo from the doc,
but it reported the same error as the original poster.
Here you need to replace --model path/to/weights with the local path to your model weights.
Initial Checks
What happened + What you expected to happen
sh scripts/run_assistant_server.sh --model-server dashscope starts successfully.
Everything works fine when I call the server with curl, but it errors out as soon as I use the OpenAI SDK with the dashscope model server.
The error occurs when running the following code:
```python
from openai import OpenAI
import os

# DashScope API key (placeholder; set this to your own key)
os.environ['DASHSCOPE_API_KEY'] = '<my dashscope api key>'

# Local assistant server started by run_assistant_server.sh
api_base = "http://localhost:31512/v1/"
model = 'Qwen2-72B-Instruct'

# Tool definition for the amap_weather tool
tools = [{
    "type": "function",
    "function": {
        "name": "amap_weather",
        "description": "amap weather tool",
        "parameters": [{
            "name": "location",
            "type": "string",
            "description": "城市/区具体名称,如`北京市海淀区`请描述为`海淀区`",
            "required": True
        }]
    }
}]
tool_choice = 'auto'

client = OpenAI(
    base_url=api_base,
    api_key="empty",
)

chat_completion = client.chat.completions.create(
    messages=[{
        "role": "user",
        "content": "海淀区天气是什么?"
    }],
    model=model,
    tools=tools,
    tool_choice=tool_choice
)
```
Error output:
python /root/modelscope-agent/temp.py
Traceback (most recent call last):
File "/root/modelscope-agent/temp.py", line 29, in
chat_completion = client.chat.completions.create(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
return self._post(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
return self._request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1005, in _request
return self._retry_request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1053, in _retry_request
return self._request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1005, in _request
return self._retry_request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1053, in _retry_request
return self._request(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Internal Server Error
The server-side error is as follows:
2024-06-15 17:49:22.000 - modelscope-agent - INFO - | message: call dashscope generation api | uuid: | details: {'model': 'Qwen2-72B-Instruct', 'messages': [{'role': 'user', 'content': 'What is the weather like in Boston?'}], 'stop': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location.', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}], 'top_p': 0.8, 'result_format': 'message', 'stream': True} | step: | error:<generator object Generation.call.. at 0x7fd2cca4bd80>
2024-06-15 17:49:22.018 - modelscope-agent - INFO - | message: call dashscope generation api | uuid: | details: {'model': 'Qwen2-72B-Instruct', 'messages': [{'role': 'system', 'content': '\n# 工具\n\n## 你拥有如下工具:\n\namap_weather: amap_weather API. amap weather tool Parameters: [{"name": "location", "type": "string", "description": "城市/区具体名称,如`北京市海淀区`请描述为`海淀区`", "required": true}] Format the arguments as a JSON object.\n\n## 当你需要调用工具时,请在你的回复中穿插如下的工具调用命令,可以根据需求调用零次或多次:\n\n工具调用\nAction: 工具的名称,必须是[amap_weather]之一\nAction Input: 工具的输入\nObservation: 工具返回的结果\nAnswer: 根据Observation总结本次工具调用返回的结果,如果结果中出现url,请使用如下格式展示出来:\n\n\n# 指令\n\nNone\n\n请注意:你具有图像和视频的展示能力,也具有运行代码的能力,不要在回复中说你做不到。\n'}, {'role': 'user', 'content': '(。你可以使用工具:[amap_weather])海淀区天气是什么?'}], 'stop': ['Observation:', 'Observation:\n'], 'top_p': 0.8, 'result_format': 'message', 'stream': True} | step: | error:<generator object Generation.call.. at 0x7fd2cca4bd80>
2024-06-15 17:49:22.019 - modelscope-agent - INFO - | message: call llm 1 times output: <generator object stream_output at 0x7fd2cca4bbc0>
INFO: 127.0.0.1:33044 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/root/modelscope-agent/modelscope_agent/llm/dashscope.py", line 136, in stat_last_call_token_info
'prompt_tokens': response.usage.input_tokens,
AttributeError: 'generator' object has no attribute 'usage'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in call
return await self.app(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/applications.py", line 123, in call
await self.middleware_stack(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in call
raise exc
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in call
await self.app(scope, receive, _send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 756, in call
await self.middleware_stack(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
await route.handle(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
await self.app(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
response = await func(request)
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
raw_response = await run_endpoint_function(
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
return await dependant.call(**values)
File "/root/modelscope-agent/modelscope_agent_servers/assistant_server/api.py", line 168, in chat_completion
for chunk in result:
File "/root/modelscope-agent/modelscope_agent/agents/role_play.py", line 289, in _run
for s in output:
File "/root/modelscope-agent/modelscope_agent/llm/dashscope.py", line 20, in stream_output
for trunk in response:
File "/root/modelscope-agent/modelscope_agent/llm/dashscope.py", line 144, in stat_last_call_token_info
if not chunk.usage.get('total_tokens'):
AttributeError: 'NoneType' object has no attribute 'get'
Exception ignored in: <generator object HttpRequest._handle_request at 0x7fd2cca4be60>
Traceback (most recent call last):
File "/root/.conda/envs/deploy/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line -1, in run_asgi
RuntimeError: generator ignored GeneratorExit
(The same sequence — dashscope generation api call, 500 Internal Server Error, and the identical AttributeError traceback — repeats verbatim for the two automatic retries at 17:49:22.928 and 17:49:24.608.)
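From the traceback, both failures happen in stat_last_call_token_info in modelscope_agent/llm/dashscope.py: with stream=True the DashScope response is a generator (so response.usage does not exist), and individual stream chunks can carry usage=None (so chunk.usage.get('total_tokens') raises). A rough, purely illustrative guard for that access pattern might look like the sketch below; it is not the project's actual fix.

```python
import inspect


def safe_token_info(response):
    """Hypothetical guard (not the actual modelscope-agent code): the attribute
    names mirror the ones in the traceback above (response.usage.input_tokens,
    chunk.usage.get('total_tokens'))."""
    if not inspect.isgenerator(response):
        # Non-streaming call: usage is available directly on the response.
        return {'prompt_tokens': response.usage.input_tokens}

    def passthrough():
        # Streaming call: the response is a generator and individual chunks
        # may carry usage=None, so guard before calling .get() on it.
        for chunk in response:
            usage = getattr(chunk, 'usage', None)
            if usage and usage.get('total_tokens'):
                pass  # token counts could be accumulated here
            yield chunk

    return passthrough()
```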
Versions / Dependencies
git clone of the latest version
Reproduction script
See above.
Issue Severity
None