github-actions bot changed the title from "Bug: .NET Semantic Kernel Function Calling Not Working with Anthropic via Amazon Connector" to ".Net: Bug: .NET Semantic Kernel Function Calling Not Working with Anthropic via Amazon Connector" on Feb 18, 2025
Describe the Bug
When I connect my Anthropic model via Microsoft.SemanticKernel.Connectors.Amazon to translate sentences into English, the query processes I wrote as plugins (which normally work) do not receive any requests. However, when I perform the same operation with OpenAI services, the model accesses these plugins without any problem. Unfortunately, this is not the case with Anthropic.
I'm experiencing the same issue with the latest o3-mini reasoning model. Could it be that the Amazon Connector is still in an alpha, experimental phase, or that the OpenAI o3-mini model is too new, causing the function-calling feature not to work properly?
Observations
When I send the same request to the gpt-4o-2024-08-06 model, the breakpoint is hit. However, when I send the same request to the anthropic.claude-3-5-sonnet-20240620-v1:0 model, the breakpoint is not hit.
Implementation Details
The implementation for both clients is almost identical. The only difference is:
For OpenAI services: I use OpenAIPromptExecutionSettings and the AddOpenAIChatCompletion methods.
For Anthropic services: I use the base class PromptExecutionSettings because I couldn't find an implemented settings class specifically for the AWS services.
Platform Information
Language: C#
NuGet Packages:
Microsoft.SemanticKernel: 1.37.0
Microsoft.SemanticKernel.Connectors.Amazon: 1.37.0-alpha
AWSSDK.BedrockRuntime: 4.0.0-preview.6
AI Models:
anthropic.claude-3-5-sonnet-20240620-v1:0
gpt-4o-2024-08-06
IDE: Microsoft Visual Studio Professional 2022 (3) (64-bit) - Current Version 17.13.0
OS: Windows
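Since the issue does not include the wiring code, here is a minimal sketch of the two setups described above. The plugin type TranslationPlugin and the environment variable names are placeholders, and the Bedrock registration method name is taken from the alpha Amazon connector, so it may differ between versions:

```csharp
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// OpenAI setup: function calls reach the registered plugins.
var openAiBuilder = Kernel.CreateBuilder();
openAiBuilder.AddOpenAIChatCompletion(
    modelId: "gpt-4o-2024-08-06",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);
openAiBuilder.Plugins.AddFromType<TranslationPlugin>(); // hypothetical plugin
var openAiKernel = openAiBuilder.Build();

var openAiSettings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Bedrock/Anthropic setup: same structure, but plugins are never invoked.
// The extension method name is an assumption based on the alpha connector.
var bedrockBuilder = Kernel.CreateBuilder();
bedrockBuilder.AddBedrockChatCompletionService(
    modelId: "anthropic.claude-3-5-sonnet-20240620-v1:0");
bedrockBuilder.Plugins.AddFromType<TranslationPlugin>();
var bedrockKernel = bedrockBuilder.Build();

// Only the base class is available here; it does expose
// FunctionChoiceBehavior, but the connector must honor it
// for tool calls to be routed back into the plugins.
var bedrockSettings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
```

Setting FunctionChoiceBehavior.Auto() on the base PromptExecutionSettings is the documented, connector-agnostic way to enable automatic function calling, which is why the difference in connector behavior stands out.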
Possible Causes
The Microsoft.SemanticKernel.Connectors.Amazon package is still in the alpha stage and might not fully support function calling.
The o3-mini model from OpenAI is very new, and compatibility issues might exist with the current function-calling implementation.
The base PromptExecutionSettings class might be causing the model to bypass the plugin execution logic.
Additional Information
I'm happy to provide more details or logs if needed.
Thanks for your time and support!
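For context, a plugin of the kind described in the report might look like the hypothetical sketch below (the class, method, and description strings are illustrative, not taken from the issue). The breakpoint mentioned under Observations would sit inside the [KernelFunction] method:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Hypothetical example of a translation plugin as described in the issue.
public sealed class TranslationPlugin
{
    [KernelFunction("translate_to_english")]
    [Description("Translates the given sentence into English.")]
    public string TranslateToEnglish(
        [Description("The sentence to translate.")] string sentence)
    {
        // Breakpoint here: hit when the request goes through
        // gpt-4o-2024-08-06, never hit with
        // anthropic.claude-3-5-sonnet-20240620-v1:0.
        return $"[translated] {sentence}";
    }
}
```

With FunctionChoiceBehavior.Auto() in the execution settings, the OpenAI connector advertises this function to the model and routes the resulting tool calls back into TranslateToEnglish; the report suggests the alpha Amazon connector does not complete that round trip.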