Ollama local LLM instead of OpenAI #1
Hey! I have a few questions for you to help me understand what's going wrong:
Ok, I added the following to the .env: OPENAI_API_BASE='http://localhost:11434/v1' and it seems to work.
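For context, a minimal .env along these lines might look like the sketch below. The dummy OPENAI_API_KEY entry is an assumption (mirroring the workaround mentioned later in the thread), and 11434 is Ollama's default port:

```
# .env: route the OpenAI-compatible client to a local Ollama server
OPENAI_API_BASE='http://localhost:11434/v1'
# Assumed: some clients refuse to start without a non-empty key, even for Ollama
OPENAI_API_KEY='DUMMY_KEY'
```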
I did more testing and I'm getting this error; did you encounter it? I'm running crewai 16.3.
I am testing this crewai example.
I modified it to use Ollama as the LLM and made the necessary changes (see the sketch below).
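For reference, the Ollama change was presumably something like this minimal sketch. It assumes the langchain-community Ollama wrapper and the Agent/Task/Crew API of the crewai version mentioned above; the model name and agent details are placeholders:

```python
from langchain_community.llms import Ollama
from crewai import Agent, Task, Crew

# Point the agent at a local Ollama server instead of OpenAI.
# "mistral" is a placeholder; use any model pulled with `ollama pull`.
local_llm = Ollama(model="mistral", base_url="http://localhost:11434")

researcher = Agent(
    role="Researcher",
    goal="Summarize a topic",
    backstory="An analyst running entirely on a local model.",
    llm=local_llm,  # without this, crewai falls back to the OpenAI API
)

task = Task(description="Summarize the benefits of local LLMs.", agent=researcher)
crew = Crew(agents=[researcher], tasks=[task])
print(crew.kickoff())
```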
I am getting this error:
It seems we encountered an unexpected error while trying to use the tool. This was the error: Error code: 401 - {'error': {'message': 'Incorrect API key provided: DUMMY_KEY. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Btw the code still runs, but it keeps repeating this error, so I'm not sure what is happening. I'll try leaving it running for a while.
I created a .env with a dummy OpenAI key, but this error persists.
Looks like CrewAI defaults to OpenAI and somehow does not let me use my local LLM.
I tried other examples and they look ok. Any idea?
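One possible explanation (an assumption, not something confirmed in this thread): the agent's LLM is local, but a tool or some other component still constructs a default OpenAI client from the environment, which would produce exactly this 401. A sketch of forcing the variables onto every OpenAI-compatible client by setting them before any other import:

```python
import os

# Assumed fix: set these before crewai/langchain imports so any OpenAI
# client created at import time picks them up.
os.environ["OPENAI_API_BASE"] = "http://localhost:11434/v1"
os.environ["OPENAI_API_KEY"] = "DUMMY_KEY"  # Ollama ignores the key, but clients require one

from crewai import Agent, Task, Crew  # noqa: E402
```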