[Bug]: When using Qwen2.5, Swarm cannot work. #1451
Comments
Which LLM provider did you use?
I am able to run Qwen2.5-72B-Instruct with the together.ai client using your script. This is my output.
@CAROLZXYZXY I use vLLM (v0.8.2) to deploy this model.
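For context, a typical way to deploy a Qwen model behind vLLM's OpenAI-compatible server looks roughly like the sketch below. This is an assumption about the reporter's setup, not their actual command; the parallelism degree is hypothetical, and the tool-call flags matter because Swarm relies on function/tool calling:

```shell
# Hedged sketch (not the reporter's actual command): serve Qwen2.5-72B-Instruct
# with vLLM's OpenAI-compatible server. --tensor-parallel-size is hardware-dependent;
# the tool-call flags enable the function calling that Swarm handoffs depend on.
vllm serve Qwen/Qwen2.5-72B-Instruct \
  --tensor-parallel-size 4 \
  --enable-auto-tool-choice \
  --tool-call-parser hermes
```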
@LianxinGao, are you able to share your LLMConfig/llm config dictionary (with sensitive information replaced)? Also, have you tried Ollama (to determine whether the issue is vLLM-specific)?
The code has already been provided above (including the LLMConfig). vLLM settings:
vLLM env:
I have only used vLLM and haven't tried Ollama yet.
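For readers without the original snippet, an AG2 llm_config pointing at a locally hosted vLLM endpoint typically looks like the following sketch. The URL, API key, and price entry are placeholders, not values from this issue:

```python
# Hedged sketch of an AG2/autogen llm_config for a local vLLM
# OpenAI-compatible endpoint. base_url and api_key are assumptions;
# vLLM accepts any api_key unless one was configured at serve time.
llm_config = {
    "config_list": [
        {
            "model": "Qwen/Qwen2.5-72B-Instruct",
            "base_url": "http://localhost:8000/v1",  # assumed vLLM address
            "api_key": "EMPTY",                      # placeholder key
            "api_type": "openai",
            "price": [0, 0],  # silence cost-tracking warnings for local models
        }
    ]
}
```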
@LianxinGao, are you hosting the Qwen model with vLLM locally?
Describe the bug
I followed the guide https://docs.ag2.ai/docs/user-guide/basic-concepts/orchestration/swarm, but Swarm does not work when using Qwen2.5.
Steps to reproduce
Code:
Model Used
Qwen2.5-72B-Instruct
Expected Behavior
No response
Screenshots and logs
No response
Additional Information
No response