Replies: 1 comment 5 replies
-
You can already do this by specifying the JSON structure and doing the manual implementation of your function inline. That was the default behavior, and the Tool Cache is meant to be a more advanced feature.
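To make that concrete, here is a rough sketch of the manual approach, not the library's verbatim API: the Function constructor arguments shown (name, description, JSON schema) and the dispatch helper are assumptions for illustration, so check the exact signatures in the version you are using.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Nodes;

// Declare the tool by hand with an explicit JSON schema instead of registering a
// method in the Tool Cache. NOTE: the Function constructor shown here (name,
// description, schema) is an assumption - verify the parameter types in your version.
var getWeatherTool = new Function(
    "get_weather",
    "Returns the current weather for a given city.",
    JsonNode.Parse("""
        {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
        """));

// When the model responds with a tool call, dispatch it inline yourself instead of
// calling toolCall.InvokeFunctionAsync, so nothing ever has to live in a cache.
string DispatchToolCall(string functionName, string argumentsJson)
{
    switch (functionName)
    {
        case "get_weather":
            var args = JsonSerializer.Deserialize<Dictionary<string, string>>(argumentsJson);
            return $"Sunny in {args?["city"]}";
        default:
            throw new InvalidOperationException($"Unhandled tool call: {functionName}");
    }
}
```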
-
In my application I need to create a tool function from a local function defined inside a method. It should not be cached at all; I only need it for the lifetime of the GPT request itself.
I'm currently migrating from an older version and don't understand why all Function constructors enforce a cache - I don't need it. Is there any way not to use it?
Describe the solution you'd like
I'd be happy if the cache were optional, or if I could just have my own caching instance where I decide myself whether to throw it away or keep it for the context of a request and then drop it.
For the convenience of toolCall.InvokeFunctionAsync, I could imagine creating some var toolCache = new ToolCache() and passing it to my ChatRequest. I should be able to also pass this toolCache to toolCall.InvokeFunctionAsync, or simply use toolCache.InvokeFunctionAsync(toolCall, cancellationToken).
By default, some static DefaultToolCache may be used for backwards compatibility. As long as I can override this default behavior and use my own tool cache, this empty instance doesn't hurt performance-wise.
Describe alternatives you've considered
So far I only see the option to add the function with a random ID. However, I'm also missing a way to remove that specific tool/function from the cache afterwards, so my only option is to clear the whole cache. This also means I'm not able to use toolCall.InvokeFunctionAsync safely, because the cache may have been cleared somewhere else in my web app in the meantime.
Please let me know what you think. Thanks a lot in advance!
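For illustration, here is a rough sketch of how the request-scoped cache proposed above might look in use. Every type and member shown (ToolCache, Function.FromMethod, ChatRequest.ToolCache, ToolCache.InvokeFunctionAsync) is hypothetical and part of the proposal rather than the current library, and the message/send types are placeholders.

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch of the proposed request-scoped ToolCache.
// None of these members exist today; they only illustrate the feature request.
async Task<string> HandleUserQueryAsync(IReadOnlyList<Message> messages, CancellationToken cancellationToken)
{
    // Local function that should only live for the duration of this one GPT request.
    string LookUpOrder(string orderId) => $"Order {orderId}: shipped";

    var toolCache = new ToolCache();                  // proposed: per-request cache, no global state
    toolCache.Add(Function.FromMethod(LookUpOrder));  // proposed: register the local function for this request only

    var chatRequest = new ChatRequest(messages)
    {
        ToolCache = toolCache                         // proposed: the request carries its own cache
    };

    var response = await SendAsync(chatRequest, cancellationToken); // placeholder for the actual API call

    foreach (var toolCall in response.ToolCalls)
    {
        // proposed: resolve against the request-scoped cache instead of a static DefaultToolCache
        var result = await toolCache.InvokeFunctionAsync(toolCall, cancellationToken);
        // ... append result to the conversation and continue as usual ...
    }

    // toolCache goes out of scope here; nothing was added to a global cache,
    // so concurrent requests elsewhere in the web app are unaffected.
    return response.ToString();
}
```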