.Net: Bug: .Net, Since package version 1.63.0, when using Ollama + gpt-oss:20b, the function calling feature does not work properly in streaming mode. #13027
Describe the bug
Since package version 1.63.0, when using Ollama + gpt-oss:20b, function calling does not work properly in streaming mode: instead of the function being invoked as expected, the chat terminates immediately.
To Reproduce
Steps to reproduce the behavior:
1. Target Semantic Kernel version 1.63.0.
2. Add a plugin to the kernel, e.g. builder.Plugins.AddFromType<YOUR_PLUGIN_TYPE>();
3. Add OllamaChatCompletion with the model gpt-oss:20b.
4. Start a streaming chat (GetStreamingChatMessageContentsAsync) with a prompt that should trigger the function call added above.

The entire chat session ends immediately after a tool-call request appears, without executing the tool call.
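A minimal repro along the lines of the steps above might look like the following sketch. The plugin type (TimePlugin), the prompt, and the exact connector registration call are assumptions for illustration; adjust them to your setup (the Ollama connector ships in the Microsoft.SemanticKernel.Connectors.Ollama package and requires a local Ollama server).

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Hypothetical plugin used only to give the model a tool to call.
public sealed class TimePlugin
{
    [KernelFunction, Description("Returns the current UTC time.")]
    public string GetUtcTime() => DateTime.UtcNow.ToString("O");
}

public static class Program
{
    public static async Task Main()
    {
        var builder = Kernel.CreateBuilder();
        builder.Plugins.AddFromType<TimePlugin>();
        builder.AddOllamaChatCompletion(
            modelId: "gpt-oss:20b",
            endpoint: new Uri("http://localhost:11434")); // default Ollama endpoint
        var kernel = builder.Build();

        var settings = new PromptExecutionSettings
        {
            FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
        };

        var chat = kernel.GetRequiredService<IChatCompletionService>();
        var history = new ChatHistory();
        history.AddUserMessage("What time is it in UTC?");

        // On 1.61.0/1.62.0 the tool is invoked and the answer streams back;
        // on 1.63.0/1.64.0 the stream ends right after the tool-call request.
        await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(
                           history, settings, kernel))
        {
            Console.Write(chunk.Content);
        }
    }
}
```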
Expected behavior
The tool should be invoked properly, and the response should be generated based on the results returned from the tool call.
Platform
Language: C#
Source: NuGet package (1.63.0)
AI model: ollama 0.11.7 + gpt-oss:20b
IDE: Rider
OS: Mac
Additional context
I tested both 1.63.0 and 1.64.0, and the same issue occurs in both.
In versions 1.61.0 and 1.62.0, however, tool calls work correctly as expected.