
Bug: OpenAIAssistantAgent - doesn't pass complex parameter to PlugIn #10611

Open
chesnelg opened this issue Feb 19, 2025 · 4 comments
Assignees
Labels
agents · bug (Something isn't working) · follow up (Issues that require a follow up from the community)

Comments

@chesnelg

Describe the bug
My Action plugin accepts two parameters: a basic string and an object. The object is retrieved by a Search plugin.
Using IChatCompletionService, the Kernel is able to retrieve the object using the Search plugin and invokes the Action plugin with both parameters, as expected.
However, when I use an OpenAIAssistantAgent, the Kernel is still able to retrieve the object using the Search plugin, but it invokes the Action plugin with only the string parameter and fails with the error: Missing argument for function parameter.
In my real use case, the kernel then loops many times on this error.

To Reproduce
Steps to reproduce the behavior:

  1. Define the plugins (using the LightModel from the Semantic Kernel examples):
   [Description("Gets a list of lights and their current state")]
   public async Task<List<LightModel>> GetLightsAsync()
   {
      return lights;
   }

   [KernelFunction("switch_on_light")]
   [Description("Switch a light on")]
   public async Task SwitchOn(LightModel light)
   {
      light.IsOn = true;
   }
  2. Add the plugin to the Kernel.
  3. Create an OpenAIAssistantAgent, passing the kernel.
  4. Add the input "switch on light".
  5. Observe the error:
       ---> System.ArgumentException: Missing argument for function parameter (Parameter 'light')
         --- End of inner exception stack trace ---
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.<>c__DisplayClass23_0.<GetParameterMarshalerDelegate>g__parameterFunc|9(KernelFunction _, Kernel kernel, KernelArguments arguments, CancellationToken __)

Expected behavior
The kernel should be able to identify the required parameter and prompt for a Light object

Platform
Semantic Kernel:
Model: gpt-4o-mini

@chesnelg chesnelg added the bug Something isn't working label Feb 19, 2025
@crickman
Contributor

crickman commented Feb 20, 2025

@chesnelg, thanks for reporting this.

I was able to observe equivalent behavior for your reported scenario for ChatCompletionAgent and OpenAIAssistantAgent using the following plugin:

public sealed class LightModel
{
    public string ID { get; init; }
    public bool IsOn { get; set; }
}

public sealed class LightPlugin
{
    [KernelFunction]
    [Description("Gets a list of lights and their current state")]
#pragma warning disable CA1024 // Use properties where appropriate
    public IList<LightModel> GetLights()
#pragma warning restore CA1024 // Use properties where appropriate
    {
        System.Console.WriteLine("\n@ Listing all light state");
        return s_lights;
    }

    [KernelFunction]
    public void SwitchLightOn(LightModel light)
    {
        System.Console.WriteLine($"\n@ Switching on: {light.ID}");
        light.IsOn = true;
    }

    [KernelFunction]
    public void SwitchLightOff(LightModel light)
    {
        System.Console.WriteLine($"\n@ Switching off: {light.ID}");
        light.IsOn = false;
    }

    private static readonly LightModel[] s_lights =
        [
            new()
            {
                ID = "Dining Room",
                IsOn = false
            },
            new()
            {
                ID = "Kitchen",
                IsOn = true
            },
            new()
            {
                ID = "Living Room",
                IsOn = false
            },
            new()
            {
                ID = "Bathroom",
                IsOn = false
            },
            new()
            {
                ID = "Bedroom",
                IsOn = true
            },
            new()
            {
                ID = "Outside",
                IsOn = false
            },
            new()
            {
                ID = "Garage",
                IsOn = false
            },
        ];
}

The user input for this test was:

  • "Did I leave the bathroom light on?"
  • "Did I leave the bedroom light on?"
  • "Can you please turn it off?"
  • "Which lights are currently on?"
  • "Didn't you turn bedroom off?"
  • "Turn on the outside lights"
  • "Which lights are currently on?"

Please note that the Semantic Kernel does not decide which function to call or which parameter values to pass to that function. This is always specified by the AI model. Semantic Kernel provides the framework to respond to the model's direction and orchestrate the execution of the functions the developer defines.

It's hard to speculate about what may be occurring for you, since the code you've provided isn't entirely complete. If you can provide more context, I'd be happy to take another look.
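One way to diagnose this class of problem is to log the raw arguments the model supplies before Semantic Kernel marshals them to .NET types. A minimal sketch, assuming the SK 1.x `IFunctionInvocationFilter` API; the filter class name is illustrative:

```csharp
using Microsoft.SemanticKernel;

// Illustrative diagnostic filter: prints each function call and the raw
// argument values supplied by the model before parameter marshaling runs.
public sealed class ArgumentLoggingFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(
        FunctionInvocationContext context,
        Func<FunctionInvocationContext, Task> next)
    {
        Console.WriteLine($"Invoking {context.Function.PluginName}-{context.Function.Name}");
        foreach (var argument in context.Arguments)
        {
            Console.WriteLine($"  {argument.Key} = {argument.Value} ({argument.Value?.GetType().Name ?? "null"})");
        }
        await next(context); // continue with the normal invocation pipeline
    }
}

// Registration sketch:
// kernelBuilder.Services.AddSingleton<IFunctionInvocationFilter, ArgumentLoggingFilter>();
```

With such a filter registered, the raw value the model sent for `light` (a plain string versus a serialized JSON object) would be visible before the `KernelFunctionFromMethod` marshaling step throws.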

@crickman crickman added the follow up Issues that require a follow up from the community. label Feb 20, 2025
@chesnelg
Author

Sorry, my package references didn't come through before:

    <PackageReference Include="Microsoft.SemanticKernel" Version="1.37.0-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel.Agents.Core" Version="1.37.0-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel.Agents.OpenAI" Version="1.37.0-alpha" />    
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.AzureAISearch" Version="1.37.0-preview" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.AzureOpenAI" Version="1.37.0-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.InMemory" Version="1.37.0-preview" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.OpenAI" Version="1.37.0" />
    <PackageReference Include="Microsoft.SemanticKernel.Process.LocalRuntime" Version="1.37.0-alpha" />

Using the plugin you have defined, I have built a minimal app.

using Azure;
using Microsoft.SemanticKernel;
using CSharp;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Agents.OpenAI;
#pragma warning disable SKEXP0010
#pragma warning disable SKEXP0120
#pragma warning disable AOAI001
#pragma warning disable SKEXP0110
#pragma warning disable SKEXP0001

public class Test
{
    private Configuration _config = new Configuration();
    public async Task Run()
    {
        var kernelBuilder = Kernel.CreateBuilder();

        kernelBuilder.AddAzureOpenAIChatCompletion(
            deploymentName: _config.OaiDeploymentName,
            endpoint: _config.OaiEndpoint,
            apiKey: _config.OaiApiKey);

        kernelBuilder.Services.AddLogging(loggingBuilder => loggingBuilder.AddConsole().SetMinimumLevel(LogLevel.Trace));
        kernelBuilder.Plugins.AddFromType<LightPlugin>();
        var kernel = kernelBuilder.Build();

        var systemMessage = "You are a helpful assistant.";
        var instructions = "You can switch lights on and off";

        var chatHistory = new ChatHistory();
        chatHistory.AddSystemMessage(systemMessage + instructions);

        var chatAiAgent = await OpenAIAssistantAgent.CreateAsync(
            OpenAIClientProvider.ForAzureOpenAI(new AzureKeyCredential(_config.OaiApiKey), new Uri(_config.OaiEndpoint)),
            new OpenAIAssistantDefinition(_config.OaiDeploymentName)
            {
                Name = "EbAgent",
                Instructions = systemMessage + instructions,
                Temperature = (float?)0.5,
                TopP = (float?)0.5,
            },
            kernel);

        var chatAiAgentThreadId = await chatAiAgent.CreateThreadAsync(options: new OpenAIThreadCreationOptions() { Messages = chatHistory });

        while (true)
        {
            // Get the prompt text
            Console.WriteLine("User:");
            string text = Console.ReadLine() ?? "";
            if (text == "exit") break;

            await chatAiAgent.AddChatMessageAsync(
                chatAiAgentThreadId, 
                new Microsoft.SemanticKernel.ChatMessageContent(AuthorRole.User, text));

            var responseText = "";
            await foreach (var responseItem in chatAiAgent.InvokeAsync(chatAiAgentThreadId))
            {
                responseText += responseItem.Items[0].ToString() ?? "";
            }
            Console.WriteLine("Agent: " + responseText);

        }
        await chatAiAgent.DeleteThreadAsync(chatAiAgentThreadId);
        await chatAiAgent.DeleteAsync();
    }
}

With the input "switch outside light on", I am getting the following log:

info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-GetLights invoking.
trce: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-GetLights arguments: {}

@ Listing all light state
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-GetLights succeeded.
trce: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-GetLights result: [{"ID":"Dining Room","IsOn":false},{"ID":"Kitchen","IsOn":true},{"ID":"Living Room","IsOn":false},{"ID":"Bathroom","IsOn":false},{"ID":"Bedroom","IsOn":true},{"ID":"Outside","IsOn":false},{"ID":"Garage","IsOn":false}]      
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-GetLights completed. Duration: 0.0268712s
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn invoking.
trce: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn arguments: {}
fail: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn failed. Error: Missing argument for function parameter 'light'
      Microsoft.SemanticKernel.KernelException: Missing argument for function parameter 'light'
       ---> System.ArgumentException: Missing argument for function parameter (Parameter 'light')
         --- End of inner exception stack trace ---
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.<>c__DisplayClass23_0.<GetParameterMarshalerDelegate>g__parameterFunc|9(KernelFunction _, Kernel kernel, KernelArguments arguments, CancellationToken __)
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.<>c__DisplayClass21_0.<GetMethodDetails>g__Function|0(Kernel kernel, KernelFunction function, KernelArguments arguments, CancellationToken cancellationToken)
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.InvokeCoreAsync(Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)
         at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass27_0.<<InvokeAsync>b__0>d.MoveNext()
      --- End of stack trace from previous location ---
         at Microsoft.SemanticKernel.Kernel.InvokeFilterOrFunctionAsync(NonNullCollection`1 functionFilters, Func`2 functionCallback, FunctionInvocationContext context, Int32 index)
         at Microsoft.SemanticKernel.Kernel.OnFunctionInvocationAsync(KernelFunction function, KernelArguments arguments, FunctionResult functionResult, Boolean isStreaming, Func`2 functionCallback, CancellationToken cancellationToken)
         at Microsoft.SemanticKernel.KernelFunction.InvokeAsync(Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn completed. Duration: 0.0257028s
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn invoking.
trce: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn arguments: {"light":"Outside"}
fail: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn failed. Error: Object of type 'System.String' cannot be converted to type 'LightModel'.      
      System.ArgumentException: Object of type 'System.String' cannot be converted to type 'LightModel'.
         at System.RuntimeType.CheckValue(Object& value, ParameterCopyBackAction& copyBack, Binder binder, CultureInfo culture, BindingFlags invokeAttr)
         at System.Reflection.MethodBase.CheckArguments(Span`1 copyOfParameters, IntPtr* byrefParameters, Span`1 shouldCopyBack, ReadOnlySpan`1 parameters, RuntimeType[] sigTypes, Binder binder, CultureInfo culture, BindingFlags invokeAttr)
         at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.Invoke(MethodInfo method, Object target, Object[] arguments)
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.<>c__DisplayClass21_0.<GetMethodDetails>g__Function|0(Kernel kernel, KernelFunction function, KernelArguments arguments, CancellationToken cancellationToken)
         at Microsoft.SemanticKernel.KernelFunctionFromMethod.InvokeCoreAsync(Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)
         at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass27_0.<<InvokeAsync>b__0>d.MoveNext()
      --- End of stack trace from previous location ---
         at Microsoft.SemanticKernel.Kernel.InvokeFilterOrFunctionAsync(NonNullCollection`1 functionFilters, Func`2 functionCallback, FunctionInvocationContext context, Int32 index)
         at Microsoft.SemanticKernel.Kernel.OnFunctionInvocationAsync(KernelFunction function, KernelArguments arguments, FunctionResult functionResult, Boolean isStreaming, Func`2 functionCallback, CancellationToken cancellationToken)
         at Microsoft.SemanticKernel.KernelFunction.InvokeAsync(Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn completed. Duration: 0.0120476s
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn invoking.
trce: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn arguments: {"light":"{\u0022ID\u0022:\u0022Outside\u0022,\u0022IsOn\u0022:false}"}

@ Switching on: Outside
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn succeeded.
trce: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn result:
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function LightPlugin-SwitchLightOn completed. Duration: 0.0074427s
Agent: The outside light has been successfully switched on.

It is not exactly the behavior I am observing with my own application (which ends up in a loop); however, I can see the "Missing argument" exception, and then a conversion exception, before the successful result. I would have expected the AI to detect the parameter before trying to invoke the plugin.
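The log above hints at why only the third attempt succeeds: the model first sent the light's bare name as a string, and only later sent the full JSON object returned by GetLights, which can be deserialized into a LightModel. A self-contained sketch of that conversion using plain System.Text.Json (no Semantic Kernel dependency; the LightModel here is a standalone copy of the type from the plugin):

```csharp
using System.Text.Json;

public sealed class LightModel
{
    public string? ID { get; init; }
    public bool IsOn { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        // Second attempt from the log: a bare string cannot be converted
        // to a LightModel, hence "Object of type 'System.String' cannot
        // be converted to type 'LightModel'".
        string secondAttempt = "Outside";
        Console.WriteLine($"Model first sent: {secondAttempt}");

        // Third attempt from the log: the full JSON payload round-trips.
        string thirdAttempt = "{\"ID\":\"Outside\",\"IsOn\":false}";

        LightModel? light = JsonSerializer.Deserialize<LightModel>(thirdAttempt);
        Console.WriteLine($"{light!.ID} IsOn={light.IsOn}"); // prints "Outside IsOn=False"
    }
}
```

This suggests the failure is in what the model chooses to send for the complex parameter, consistent with the maintainer's note that the model, not the kernel, specifies the argument values.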

@crickman
Contributor

Can you confirm my understanding that you are using the exact plugin and model I provided in my response?

Also, can you please attempt to run this without specifying Temperature or TopP? I seem to remember the docs recommending modifying one or the other, but not both. I'm just curious about a comparison using the defaults.

Also, can you please confirm the model and version you are targeting?

@chesnelg
Author

Yes I am using the exact plugin you have defined above.
I did further tests, commenting out the Temperature and TopP parameters separately, using gpt-35-turbo-16k and gpt-4o-mini. Same result...
As mentioned before, it is working fine when I use the IChatCompletionService.
