Python: Bug: Mistral-Large-2411 does not work as expected #10586
Hi @ManojBableshwar, can you have a look at using our Azure AI Inference connector? It looks like they provide support for calling Mistral models, as shown here: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-mistral?tabs=mistral-large&pivots=programming-language-python#create-a-client-to-consume-the-model. If needed, please provide more context about where your model is hosted -- for example, is it hosted through Azure AI Foundry?
Tried this:
but I get the same error...
@TaoChenOSU thoughts here?
Hi @ManojBableshwar, thank you for reaching out! TL;DR: Try inserting an assistant message after the tool messages.
Good catch, this worked:

```python
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.azure_ai_inference import (
    AzureAIInferenceChatCompletion,
    AzureAIInferenceChatPromptExecutionSettings,
)
from semantic_kernel.contents import (
    ChatMessageContent,
    FunctionCallContent,
    FunctionResultContent,
    TextContent,
)
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.contents.utils.author_role import AuthorRole

chat_completion_service = AzureAIInferenceChatCompletion(
    ai_model_id="Mistral-Large-2411",
)

kernel = Kernel()
kernel.add_service(chat_completion_service)

execution_settings = AzureAIInferenceChatPromptExecutionSettings()

chat_history = ChatHistory()

# Add the user request
chat_history.add_message(
    ChatMessageContent(
        role=AuthorRole.USER,
        items=[
            TextContent(text="list all unique allergens"),
        ],
    )
)

# Add a simulated function call from the assistant
chat_history.add_message(
    ChatMessageContent(
        role=AuthorRole.ASSISTANT,
        items=[
            FunctionCallContent(
                name="get_user_allergies-User",
                id="123456789",
                arguments=str({"username": "laimonisdumins"}),
            ),
            FunctionCallContent(
                name="get_user_allergies-User",
                id="223456789",
                arguments=str({"username": "emavargova"}),
            ),
        ],
    )
)

# Add the simulated function results from the tool role
chat_history.add_message(
    ChatMessageContent(
        role=AuthorRole.TOOL,
        items=[
            FunctionResultContent(
                name="get_user_allergies-User",
                id="123456789",
                result='{ "allergies": ["peanuts", "gluten", "eggs"] }',
            )
        ],
    )
)
chat_history.add_message(
    ChatMessageContent(
        role=AuthorRole.TOOL,
        items=[
            FunctionResultContent(
                name="get_user_allergies-User",
                id="223456789",
                result='{ "allergies": ["dairy", "gluten"] }',
            )
        ],
    )
)

# chat_history.add_user_message("list unique allergens")


async def main():
    response = await chat_completion_service.get_chat_message_content(
        chat_history=chat_history,
        settings=execution_settings,
    )
    print(response)


asyncio.run(main())
```
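To illustrate the suggested fix without the Semantic Kernel dependency, here is a minimal sketch using plain dicts in place of `ChatMessageContent`. The `patch_history` helper is hypothetical (not part of any library); it shows the pattern from the TL;DR above, assuming the failure is caused by a chat history that ends on tool-result messages:

```python
def patch_history(messages):
    """Return a copy of the chat history with an assistant turn appended
    if the history currently ends on tool results."""
    patched = list(messages)
    if patched and patched[-1]["role"] == "tool":
        # Hypothetical filler content; the point is the role ordering.
        patched.append({"role": "assistant", "content": "Function results received."})
    return patched


# Role sequence mirroring the simulated-function-call example above
history = [
    {"role": "user", "content": "list all unique allergens"},
    {"role": "assistant", "content": "(tool calls: get_user_allergies-User x2)"},
    {"role": "tool", "content": '{ "allergies": ["peanuts", "gluten", "eggs"] }'},
    {"role": "tool", "content": '{ "allergies": ["dairy", "gluten"] }'},
]

patched = patch_history(history)
print([m["role"] for m in patched])
# → ['user', 'assistant', 'tool', 'tool', 'assistant']
```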
Describe the bug
I'm trying out the simulating-function-calls section from the docs at https://learn.microsoft.com/en-us/semantic-kernel/concepts/ai-services/chat-completion/chat-history?pivots=programming-language-python. It works with gpt-4o-mini but does not work with Mistral-Large-2411.
To Reproduce
Expected behavior
If I use gpt-4o-mini, I correctly get a response of the form: "The unique allergens from the provided lists are: ... So the complete list of unique allergens is: ..."
However, if I use Mistral-Large-2411, I get an error instead.
Platform