
Dependency issues when using vllm to accelerate inference for Qwen1.5 and later models #3974

Open
xqxls opened this issue May 9, 2024 · 1 comment
Labels
bug Something isn't working

Comments

xqxls commented May 9, 2024

On Linux, using vllm to accelerate inference for Qwen1.5 and later models requires vllm 0.3.0 or above, and vllm==0.3.3 seems to be the most widely used version. Is there a requirements.txt file that matches vllm==0.3.3? At the moment, installing vllm==0.3.3 produces all sorts of Python package dependency conflicts.

@xqxls xqxls added the bug Something isn't working label May 9, 2024

dosubot bot commented May 9, 2024

For the Python package dependency conflicts you are seeing with vllm 0.3.3, we suggest creating a virtual environment and installing the project's dependencies inside it to avoid clashes with globally installed packages. You can follow these steps:

  1. Clone the Langchain-Chatchat repository:
git clone https://github.com/chatchat-space/Langchain-Chatchat.git
  2. Enter the repository directory:
cd Langchain-Chatchat
  3. Install all dependencies:
pip install -r requirements.txt
pip install -r requirements_api.txt
pip install -r requirements_webui.txt

Note: if you want to use a vector store such as milvus or pg_vector, edit the corresponding dependencies in requirements.txt before installing.
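If you prefer to script the setup, the virtual environment suggested above can also be created from Python's standard-library `venv` module (equivalent to running `python -m venv chatchat-env`; the environment name `chatchat-env` is arbitrary, not something the repo prescribes). A minimal sketch:

```python
# Sketch: create an isolated virtual environment for the project so its pinned
# dependencies do not collide with globally installed packages.
import venv

# with_pip=True bootstraps pip inside the new environment via ensurepip.
builder = venv.EnvBuilder(with_pip=True)
builder.create("chatchat-env")

# Afterwards, activate it in your shell with `source chatchat-env/bin/activate`
# and run the pip install commands listed above inside it.
```

On most systems the plain `python -m venv chatchat-env` command does the same thing; the programmatic form is mainly useful inside setup scripts.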

Also, if you can share more details about the specific Python package dependency conflicts, I may be able to give more targeted help.
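To gather those details, a short standard-library script can report which versions of the likely culprits are installed (the package list below is illustrative, not taken from the repo; adjust it to match the conflicts pip reports):

```python
from importlib.metadata import version, PackageNotFoundError

# Illustrative list of packages that commonly show up in vllm-related
# dependency conflicts; edit to match the errors pip prints for you.
CANDIDATES = ["vllm", "torch", "transformers", "xformers", "pydantic"]

def installed_version(pkg):
    """Return the installed version string of pkg, or None if it is absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

if __name__ == "__main__":
    for pkg in CANDIDATES:
        print(f"{pkg}: {installed_version(pkg) or 'not installed'}")
```

Running `pip check` inside the environment is another quick way to list the unsatisfied requirements.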

