Running Ollama with LLama3 and Phi3 #341
The problem is not with Ollama, but with the capability of the model served by Ollama. The model failed to follow the instructions in the prompt to generate a response in the correct format. Phi3 is typically too small to produce a correctly formatted response.
No, the problem is with Ollama. I have never had a single model work correctly with Ollama and TaskWeaver (across 4 machines: Win 10/11 and Linux). LM Studio Server works fine.
Hello,
I wanted to open this issue because, when using TaskWeaver with Ollama running on the local machine, none of the models provided by Ollama are functional.
The way I configured the taskweaver_config.json is:
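For reference, a typical setup points TaskWeaver at Ollama's OpenAI-compatible endpoint. A minimal sketch might look like the following (the field names, endpoint URL, and API key placeholder are assumptions to verify against the TaskWeaver and Ollama documentation; `phi3` stands in for whichever model has been pulled locally):

```json
{
  "llm.api_type": "openai",
  "llm.api_base": "http://localhost:11434/v1",
  "llm.api_key": "ollama",
  "llm.model": "phi3"
}
```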
When I try running Ollama with phi3, the output that I get is:
The same issue occurs when configuring TaskWeaver with Llama3 in the same way.
I hope to hear back!
Best,
Arhaan