Bug: SK runs its own self-made function, ignoring the auto-invoke function config #281
Which model are you using?
I'm using GPT-4o as the model.
All method metadata and user intents are assembled in the prompt (in the userInput and metadata variables).
I have run a sample based on the original code, and it appears to execute as expected. Are you able to share more context or a sample of the failing code?
@johnoliver Sorry for the mistaken description earlier. My prompt is:
My metadata looks like:
So from my understanding you just want the JSON back, not the function invocation. In that case I think passing:
is the mistake: passing true instructs the kernel to perform automatic function invocations. However, even if you pass false here, the LLM is likely to still request a tool invocation; it will just be left to the user to make that call manually. For your use case you might not want to use function calling at all, since the LLM will try to use functions as part of its execution, whereas I believe you simply want the JSON returned. I would enable JSON structured outputs, not pass any tool-call behaviour at all, and see if that works as you need.
Because I also need function calling to obtain some contextual information. For example, constructing parameters from time descriptions, like transforming "one day ago" into start_time and end_time using methods such as getCurrentDate, as well as some other utility functions.
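As a hypothetical illustration of the kind of utility function described above (the names and the date format here are my assumptions, not the reporter's actual code), a relative description such as "one day ago" can be resolved into a start/end window with plain java.time:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class TimeWindow {
    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Turns a relative description such as "one day ago" into a
    // [start_time, end_time] pair, where end_time is the supplied "now".
    static String[] resolve(String description, LocalDateTime now) {
        LocalDateTime start;
        switch (description) {
            case "one hour ago": start = now.minusHours(1); break;
            case "one day ago":  start = now.minusDays(1);  break;
            case "one week ago": start = now.minusWeeks(1); break;
            default: throw new IllegalArgumentException("unknown description: " + description);
        }
        return new String[] { start.format(FMT), now.format(FMT) };
    }

    public static void main(String[] args) {
        LocalDateTime now = LocalDateTime.of(2024, 6, 15, 12, 0, 0);
        String[] window = resolve("one day ago", now);
        System.out.println(window[0] + " .. " + window[1]);
        // prints 2024-06-14 12:00:00 .. 2024-06-15 12:00:00
    }
}
```

Passing a fixed "now" into the helper keeps it deterministic and testable; a real plugin would call LocalDateTime.now() at the boundary.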
So I think this is probably more of a prompt/model issue than a Semantic Kernel issue: asking the LLM to invoke some functions but not the final one is possibly too complex a concept for it to follow reliably. What I am somewhat surprised about (if I am understanding the situation correctly) is that the LLM hallucinated and tried to invoke a function that was not passed to it as an available function; it effectively invented that call. A second thing to try: in your few-shot examples, rename the function to remove the "$". Again, from my interpretation the issue is that with the "$" it is not a valid function name. If the sample uses a valid function name, what I am hoping will happen is:
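To make the rename suggestion concrete, here is one hedged way to do it mechanically (this helper is my own sketch, not part of Semantic Kernel's API): strip every character that the tool-call name pattern disallows, so "$getCurrentDate" from a few-shot example becomes "getCurrentDate":

```java
public class ToolNameSanitizer {
    // Removes every character outside the set allowed by OpenAI's
    // tool-call name pattern ^[a-zA-Z0-9_-]+$ (quoted in the 400 error).
    static String sanitize(String name) {
        String cleaned = name.replaceAll("[^a-zA-Z0-9_-]", "");
        if (cleaned.isEmpty()) {
            throw new IllegalArgumentException("no valid characters in: " + name);
        }
        return cleaned;
    }

    public static void main(String[] args) {
        System.out.println(sanitize("$getCurrentDate")); // prints getCurrentDate
    }
}
```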
Describe the bug
I set the ToolCallBehavior allowOnlyKernelFunctions to true and passed the functions to the kernel, but the kernel made a function tool call by itself and just invoked it.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
Only invoke the passed functions.
Screenshots
Maven
Platform
Additional context
Exception: com.azure.core.exception.HttpResponseException: Status code 400, "{"error":{"message":"Invalid 'messages[1].tool_calls[1].function.name': string does not match pattern. Expected a string that matches the pattern '^[a-zA-Z0-9_-]+$'.","type":"invalid_request_error","code":"invalid_value","param":"messages[1].tool_calls[1].function.name"}}"
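The 400 above is the OpenAI endpoint rejecting the echoed tool-call name before the request is processed. A minimal self-contained check of the quoted pattern shows why a "$"-prefixed name (as in the few-shot examples discussed above) fails while the plain name passes:

```java
import java.util.regex.Pattern;

public class ToolNameCheck {
    // The exact pattern quoted in the invalid_request_error message.
    static final Pattern VALID_NAME = Pattern.compile("^[a-zA-Z0-9_-]+$");

    static boolean isValidToolName(String name) {
        return VALID_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidToolName("$getCurrentDate")); // false: '$' is not allowed
        System.out.println(isValidToolName("getCurrentDate"));  // true
    }
}
```

Running such a check on every function name before it is placed into messages[...].tool_calls would surface the hallucinated name locally instead of as an HTTP 400.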