Dependency problems when using vLLM to accelerate inference for Qwen1.5 and later models #3974
For the Python package dependency conflicts you are hitting with vllm 0.3.3, try creating a virtual environment and installing the project's dependencies inside it to avoid the conflicts. You can follow these steps:

git clone https://github.com/chatchat-space/Langchain-Chatchat.git
cd Langchain-Chatchat
# create and activate a clean virtual environment first
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
pip install -r requirements_api.txt
pip install -r requirements_webui.txt

Note: if you want to use a vector store such as milvus or pg_vector, make the corresponding changes before installing. Also, if you can share more details about the specific Python package dependency conflicts, I may be able to give more targeted help.
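To pin down which packages are actually conflicting after the installs above, `pip check` reports broken requirements in the current environment. A minimal sketch wrapping it (the helper name `find_dependency_conflicts` is hypothetical, not part of the project):

```python
import subprocess
import sys

def find_dependency_conflicts() -> list[str]:
    """Run `pip check` in the current interpreter's environment and
    return its conflict report lines (empty list if consistent)."""
    result = subprocess.run(
        [sys.executable, "-m", "pip", "check"],
        capture_output=True,
        text=True,
    )
    if result.returncode == 0:
        return []  # no broken requirements detected
    return [line for line in result.stdout.splitlines() if line.strip()]
```

Pasting the returned lines into the issue would make the conflict report much easier to act on.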
On Linux, accelerating inference for Qwen1.5 and later models with vLLM requires vllm 0.3.0 or newer. vllm==0.3.3 seems to be the most widely used version. Is there a requirements.txt file matched to vllm==0.3.3? At the moment, installing vllm==0.3.3 produces all sorts of Python package dependency conflicts.