
Error "Co-worker mentioned not found..." when using with local llama3 #620

Open
italovieira opened this issue May 15, 2024 · 7 comments · May be fixed by #622

Comments

@italovieira
Contributor

italovieira commented May 15, 2024

With the example below I get the following error:

```
Error executing tool. Co-worker mentioned not found, it must to be one of the following options:
- pilot
```
```python
from crewai import Agent, Task, Crew
from langchain_community.llms import Ollama

import os

os.environ["OPENAI_API_KEY"] = "NA"

llm = Ollama(model="llama3")

# Agents
luke = Agent(
    role="pilot",
    goal="Destroy the Death Star",
    backstory="The young destined-to-be-Jedi pilot, summoned to attack the Death Star.",
    llm=llm,
)

leia = Agent(
    role="strategist",
    goal="Coordinate the attack on the Death Star",
    backstory="The Rebel leader, essential for strategy and communication.",
    llm=llm,
)

# Tasks
coordinate_attack = Task(
    description="""Leia must coordinate the mission,
    maintaining communication and providing strategic support.
    Leia must ensure that everything is in order, providing a safe path for Luke""",
    expected_output="""Successfully coordinated attack, Death Star destroyed. All units informed and aligned.""",
    agent=leia,
    allow_delegation=True,
)

destroy_death_star = Task(
    description="""Luke must pilot his X-Wing and shoot at the Death Star's weak point to destroy it.""",
    expected_output="""Death Star destroyed, mission successful.""",
    agent=luke,
)

# Crews
rebel_alliance = Crew(
    agents=[leia, luke],
    tasks=[coordinate_attack, destroy_death_star],
    verbose=2,
)

rebel_alliance.kickoff()
```
@noggynoggy
Contributor

There are multiple issues with your code.

  1. `manager_llm` only works with `Process.hierarchical`.
  2. It is generally recommended to prompt in English and then just instruct the model to "translate" the response.
  3. When using Ollama, use the Ollama model class, not OpenAI: `from langchain_community.llms import Ollama`

If I haven't missed anything, the coworker not being found is either because of points 1-3 or because llama3 is "too dumb".
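For context on why a weaker local model can trigger this error at all: the message lists the crew's agent roles (`- pilot`), which suggests the delegation tool looks the requested coworker up against those roles. The sketch below is a simplified illustration, not CrewAI's actual code: with a strict exact-match lookup, any stray quotes, casing, or whitespace in the model's tool-call output makes the lookup fail with exactly this kind of error.

```python
# Simplified sketch (not CrewAI's actual implementation) of how a
# strict coworker lookup can produce "Co-worker mentioned not found".
def find_coworker(requested: str, available_roles: list[str]) -> str:
    # Exact-match lookup: any deviation in the model's output
    # (extra quotes, different casing, trailing whitespace) misses.
    if requested not in available_roles:
        raise ValueError(
            "Co-worker mentioned not found, it must to be one of "
            "the following options:\n- " + "\n- ".join(available_roles)
        )
    return requested

roles = ["pilot"]
find_coworker("pilot", roles)      # exact match works
# find_coworker('"pilot"', roles)  # a quoted value, as llama3 may emit, raises
```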

@noggynoggy
Contributor

Actually this might be related to #602

@italovieira
Contributor Author

> There are multiple issues with your code.
>
> 1. `manager_llm` only works with `Process.hierarchical`.
>
> 2. It is generally recommended to prompt in English and then just instruct the model to "translate" the response.
>
> 3. When using Ollama, use the Ollama model class, not OpenAI: `from langchain_community.llms import Ollama`

I've updated the code in the description with what you indicated.

For point 3, though, I used ChatOpenAI just like the example in the docs: https://docs.crewai.com/how-to/LLM-Connections/#ollama-integration-ex-for-using-llama-2-locally

Either way, the error still occurs.

@italovieira
Contributor Author

This might be a problem in how Ollama or langchain outputs the steps for the agents.

But I did a bisect and found that crewAI was able to cope with that before 0b78106.
This error only started after that commit.

@italovieira italovieira linked a pull request May 15, 2024 that will close this issue
italovieira added a commit to italovieira/crewAI that referenced this issue May 15, 2024
@Yazington

Hey @italovieira have you been able to fix it? Getting same issue :(

@italovieira
Contributor Author

> Hey @italovieira have you been able to fix it? Getting same issue :(

I've opened a PR to fix this issue, but it's not merged yet.

@madmag77

Without logs it's hard to figure out the reason. I've had the same problem when the LLM (Mistral 0.3 in my case) returned the action input key as `co-worker` instead of the required `coworker`, and this caused the same misleading error about an absent agent. I fixed it in this PR. Already tested in my fork - it works fine. Maybe you can try this fix and see the result.
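The kind of fix described above can be sketched as key normalization. The `sanitize_keys` helper below is hypothetical (an illustration of the idea, not the PR's actual code): lowercase the action-input keys the LLM emits and strip non-identifier characters, so `co-worker`, `Co-Worker`, and `coworker` all resolve to the same `coworker` key.

```python
import re

# Hypothetical helper illustrating the fix idea: normalize the
# action-input keys emitted by the LLM so variants like "co-worker"
# or "Co-Worker" match the expected "coworker" key.
def sanitize_keys(action_input: dict) -> dict:
    return {
        re.sub(r"[^a-z0-9_]", "", key.lower()): value
        for key, value in action_input.items()
    }

raw = {"co-worker": "pilot", "task": "Destroy the Death Star"}
clean = sanitize_keys(raw)
# clean now has a "coworker" key, so a strict lookup on it succeeds
```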
