
There doesn't seem to be a way to set the context.Terminate in an AIFunction call #5705

Open
gyankov opened this issue Nov 29, 2024 · 5 comments

Comments

@gyankov

gyankov commented Nov 29, 2024

From what I can see in the source, the context is created internally, and by default it continues the function calls.

Perhaps it would be better, when the return type of the AIFunction is void, to set the FunctionInvocationContext.Terminate property to true by default. Or somehow allow users to set it manually when needed.
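
For illustration, a minimal sketch of the situation (not from the issue; the EndConversation method name is made up): a void-returning method exposed as a tool via AIFunctionFactory.Create has no access to a FunctionInvocationContext, so it cannot set Terminate itself.

using Microsoft.Extensions.AI;

// Hypothetical tool: a void-returning method the model can call to end the chat.
void EndConversation()
{
    // No FunctionInvocationContext parameter is available here, so there is no
    // way to set context.Terminate = true from inside the function; the
    // invocation loop simply continues after this returns.
}

var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(EndConversation)]
};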

@SteveSandersonMS
Member

@gyankov Thanks for pointing this out.

Perhaps it would be better, when the return type of the AIFunction is void, to set the FunctionInvocationContext.Terminate property to true by default

That's not the semantics that are intended here. Even if an AIFunction is void, the LLM should be allowed to invoke subsequent functions if it deems that applicable (and if other stopping conditions, such as the maximum number of calls, are not met).

The design we had in mind is to use the FunctionInvocationContext object. This has a public bool Terminate { get; set; } property that can be set, and doing so will cause the function invocation loop to end. However, this can't currently be used in practice because we don't pass the FunctionInvocationContext object into the AIFunction. As I recall, we were planning to have some way to receive that object in the AIFunction code, e.g. something like:

void MyFunction(FunctionInvocationContext context, string otherParam1, int otherParam2) { ... }

... matching on the object type.

We can treat this issue as tracking the need to complete this functionality. Does this match your expectations, @stephentoub?

@stephentoub
Member

stephentoub commented Dec 3, 2024

However, this can't currently be used in practice because we don't pass the FunctionInvocationContext object into the AIFunction

It can be used, but not via AIFunction, as you call out. Right now you need to derive from FunctionInvokingChatClient and override the InvokeFunctionAsync method. That lets you add logic like "if the method being invoked is ABC, then set Terminate = true".

protected override async Task<object?> InvokeFunctionAsync(FunctionInvocationContext context, CancellationToken cancellationToken)
{
    try
    {
        // Delegate the actual invocation to the base implementation.
        return await base.InvokeFunctionAsync(context, cancellationToken);
    }
    finally
    {
        // After the call completes (or throws), end the function-calling loop
        // if the function that was just invoked is the one named "ABC".
        if (context.Function.Metadata.Name == "ABC") context.Terminate = true;
    }
}

We'd need to think more about a design that would actually funnel that FunctionInvocationContext into the AIFunction itself.

@gyankov
Author

gyankov commented Dec 4, 2024

Thank you @SteveSandersonMS and @stephentoub! Overriding InvokeFunctionAsync works, unless I add UseFunctionInvocation, as it always creates a FunctionInvokingChatClient instance.

@stephentoub
Member

Overriding InvokeFunctionAsync works

Great!

unless I add UseFunctionInvocation, as it always creates a FunctionInvokingChatClient instance

Yes, if you're using your own type, you'd need to use the Use method to add it to the pipeline rather than using the UseFunctionInvocation helper.
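
For example, something along these lines (a minimal sketch; MyFunctionInvokingChatClient is a hypothetical name for a class containing the InvokeFunctionAsync override above, and the exact ChatClientBuilder shape may differ between preview versions):

using Microsoft.Extensions.AI;

// Hypothetical derived client that carries the InvokeFunctionAsync override shown earlier.
sealed class MyFunctionInvokingChatClient(IChatClient innerClient)
    : FunctionInvokingChatClient(innerClient)
{
    // ... InvokeFunctionAsync override from the earlier comment ...
}

static class Pipeline
{
    // Wire the custom client in with Use(...) instead of UseFunctionInvocation().
    public static IChatClient Create(IChatClient innerChatClient) =>
        new ChatClientBuilder(innerChatClient)
            .Use(inner => new MyFunctionInvokingChatClient(inner))
            .Build();
}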

@gyankov
Author

gyankov commented Dec 4, 2024

Got you! Thanks once again @stephentoub.
