After loading the BELLE and Vicuna models, asking a question raises an error? #97
Labels: bug (Something isn't working)

Comments
Judging from the error, the Vicuna-7b model does not seem to support the .generate() method. If possible, please switch to another model, for example a chat-gpt 7B model, and try again.

OK, got it!
chatglm-6B-int8 loads and answers questions normally, but after loading BELLE-7b or Vicuna-7b, asking a question shows ERROR on the page, and the backend logs the following:

TypeError: The current model class (LlamaModel) is not compatible with .generate(), as it doesn't have a language model head. Please use one of the following classes instead: {'LlamaForCausalLM'}

A breakpoint locates the failure at this line in the get_knowledge_based_answer function of the KnowledgeBasedChatLLM class:

result = knowledge_chain({"query": query})
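The error itself points at the root cause: the model was loaded as the bare `LlamaModel` backbone, which outputs hidden states only, whereas `.generate()` needs the `LlamaForCausalLM` wrapper that adds a language-model head. In Hugging Face transformers this usually means the loading code used `AutoModel.from_pretrained(...)` where `AutoModelForCausalLM.from_pretrained(...)` was needed. The toy classes below are an illustration of that distinction, not the project's code; the class names simply mirror the ones in the error message.

```python
# Toy illustration (not the repository's actual code) of why .generate()
# fails on a bare backbone class but works on the *ForCausalLM wrapper.

class LlamaModel:
    """Backbone only: produces hidden states, has no vocabulary projection."""

    def generate(self, prompt: str) -> str:
        # Mirrors the transformers error: no language-model head available.
        raise TypeError(
            "The current model class (LlamaModel) is not compatible with "
            ".generate(), as it doesn't have a language model head."
        )


class LlamaForCausalLM(LlamaModel):
    """Backbone plus an lm_head mapping hidden states to next-token logits."""

    def generate(self, prompt: str) -> str:
        # Stand-in for real decoding; a head makes token generation possible.
        return prompt + " <generated tokens>"


def get_knowledge_based_answer(model, query: str) -> str:
    """Stand-in for the failing call path: the chain ends in model.generate()."""
    return model.generate(query)


try:
    get_knowledge_based_answer(LlamaModel(), "hello")
except TypeError as e:
    print("fails:", e)

print(get_knowledge_based_answer(LlamaForCausalLM(), "hello"))
```

So the practical fix on the loading side is to construct the LLaMA-family models (BELLE-7b, Vicuna-7b) through the causal-LM class rather than the base model class.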