Ollama local LLM instead of OpenAI · Issue #1

@ciaotesla

Description

I am testing this crewai example.
I modified it to use Ollama as the LLM and made the necessary changes.
I am getting this error:

It seems we encountered an unexpected error while trying to use the tool. This was the error: Error code: 401 - {'error': {'message': 'Incorrect API key provided: DUMMY_KEY. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

Btw, the code still runs, but it keeps repeating this error, so I'm not sure what is happening. I will try leaving it running for a while.

I created a .env with a dummy OpenAI key, but the error persists.
It looks like CrewAI defaults to OpenAI and somehow does not let me use my local LLM.

I tried other examples and they look OK. Any ideas?
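For reference, the Ollama wiring I would expect to work looks roughly like the sketch below. This is an assumption-heavy sketch, not the exact code from the example: the model name `ollama/llama3`, the base URL, and the agent/task fields are placeholders from my local setup, and it needs the `crewai` package plus a running Ollama server. Note that even with `llm=` set on the agent, any tool that creates its own OpenAI client (e.g. for embeddings) will still hit the OpenAI API and fail with a 401 on a dummy key.

```python
from crewai import Agent, Crew, Task, LLM

# Point CrewAI at a local Ollama server instead of OpenAI.
# Model name and base_url are assumptions -- adjust to your install.
local_llm = LLM(
    model="ollama/llama3",
    base_url="http://localhost:11434",
)

researcher = Agent(
    role="Researcher",
    goal="Answer questions using only the local model",
    backstory="Runs fully offline against Ollama.",
    llm=local_llm,  # agent-level override; does NOT cover tools' own clients
)

task = Task(
    description="Summarize what CrewAI is in two sentences.",
    expected_output="A two-sentence summary.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()
print(result)
```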
