Bug: OpenAIAssistantAgent - doesn't pass complex parameter to PlugIn #10611
Comments
@chesnelg, thanks for reporting this. I was able to observe equivalent behavior for your reported scenario with the following plugin:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

public sealed class LightModel
{
    public string ID { get; init; }
    public bool IsOn { get; set; }
}

public sealed class LightPlugin
{
    [KernelFunction]
    [Description("Gets a list of lights and their current state")]
#pragma warning disable CA1024 // Use properties where appropriate
    public IList<LightModel> GetLights()
#pragma warning restore CA1024 // Use properties where appropriate
    {
        System.Console.WriteLine("\n@ Listing all light state");
        return s_lights;
    }

    [KernelFunction]
    public void SwitchLightOn(LightModel light)
    {
        System.Console.WriteLine($"\n@ Switching on: {light.ID}");
        light.IsOn = true;
    }

    [KernelFunction]
    public void SwitchLightOff(LightModel light)
    {
        System.Console.WriteLine($"\n@ Switching off: {light.ID}");
        light.IsOn = false;
    }

    private static readonly LightModel[] s_lights =
    [
        new() { ID = "Dining Room", IsOn = false },
        new() { ID = "Kitchen", IsOn = true },
        new() { ID = "Living Room", IsOn = false },
        new() { ID = "Bathroom", IsOn = false },
        new() { ID = "Bedroom", IsOn = true },
        new() { ID = "Outside", IsOn = false },
        new() { ID = "Garage", IsOn = false },
    ];
}
```

The user input for this test was:
Please note that the Semantic Kernel does not decide which function to call or the parameter values to pass to that function. This is always specified by the AI model. Semantic Kernel provides the framework to respond to the model's direction and orchestrate the execution of the functions the developer defines. It's hard to speculate about what may be occurring for you since the code you've provided isn't entirely complete. If you can provide more context, I'd be happy to take another look.
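For reference, this is roughly what the chat-completion path looks like when the kernel only advertises the plugin's functions and auto-invokes whatever calls the model requests. This is a minimal sketch assuming Microsoft.SemanticKernel 1.x with the OpenAI connector and .NET implicit usings; the `OPENAI_API_KEY` environment variable and the "Lights" plugin name are placeholders:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

string apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")!;

IKernelBuilder builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(modelId: "gpt-4o-mini", apiKey: apiKey);
builder.Plugins.AddFromType<LightPlugin>("Lights");
Kernel kernel = builder.Build();

IChatCompletionService chat = kernel.GetRequiredService<IChatCompletionService>();

// Advertise the plugin functions as tools and let the kernel auto-invoke
// whichever calls the model decides to make.
OpenAIPromptExecutionSettings settings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

ChatHistory history = new();
history.AddUserMessage("switch outside light on");

ChatMessageContent reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply);
```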
Sorry, my package reference didn't come through before.
Using the plugin you have defined, I have built a minimal app.
With the input "switch outside light on", I am getting the following log:
It is not exactly the behavior I am observing with my own application (which ends up in a loop); however, I can see the "Missing Argument" exception, and then a conversion exception, before the successful result. I would have expected the AI to detect the parameter before trying to invoke the plugin.
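One way to see exactly what the model is being asked to supply for the complex `light` parameter is to dump the metadata Semantic Kernel generates for the plugin. This is a rough diagnostic sketch, assuming Microsoft.SemanticKernel 1.x; the "Lights" plugin name is arbitrary:

```csharp
using System.Text.Json;
using Microsoft.SemanticKernel;

// Build the plugin the same way the kernel does and print each function's
// parameter metadata, including the JSON schema the model must satisfy
// (e.g. the "light" object expected by SwitchLightOn).
KernelPlugin plugin = KernelPluginFactory.CreateFromType<LightPlugin>("Lights");

foreach (KernelFunction function in plugin)
{
    Console.WriteLine($"Function: {function.Name}");
    foreach (KernelParameterMetadata parameter in function.Metadata.Parameters)
    {
        string schema = JsonSerializer.Serialize(parameter.Schema);
        Console.WriteLine($"  {parameter.Name} (required: {parameter.IsRequired}): {schema}");
    }
}
```

If the model's tool call omits the `light` argument entirely, the kernel has nothing to bind, which is consistent with the "Missing argument for function parameter" error observed above.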
Can you confirm my understanding that you are using the exact plugin and model I provided in my response? Also, can you please attempt to run this without specifying …? And can you please confirm the model and version you are targeting?
Yes, I am using the exact plugin you have defined above.
Describe the bug
My Action plugin accepts two parameters: a basic string and an object. The object is retrieved from a Search plugin (see the sketch below for the general shape).
Using IChatCompletionService, the Kernel is able to retrieve the object using the Search plugin and invokes the Action plugin with both parameters, as expected.
However, when I use an OpenAIAssistantAgent, the Kernel is also able to retrieve the object using the Search plugin, but it invokes the Action plugin with only the string parameter and fails with the error:
Error: Missing argument for function parameter
In my real use case, the kernel loops many times on
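For illustration only (the actual Search and Action plugins are not included in this report), the Action plugin has roughly this shape: one simple string parameter plus one complex object parameter. The type and member names below are invented for the example, reusing the LightModel type shown earlier in this thread:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Hypothetical sketch of the described Action plugin: a simple string parameter
// and a complex object parameter that the assistant fails to supply.
public sealed class ActionPlugin
{
    [KernelFunction]
    [Description("Performs the requested action on the given light")]
    public void PerformAction(
        [Description("The action to perform, e.g. \"switch on\"")] string action,
        [Description("The light to act on, as returned by the search plugin")] LightModel light)
    {
        Console.WriteLine($"@ {action}: {light.ID}");
    }
}
```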
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The kernel should be able to identify the required parameter and prompt for a Light object
Platform
Semantic Kernel:
Model: gpt-4o-mini