
.NET Bug: Since package version 1.63.0, when using Ollama + gpt-oss:20b, function calling does not work properly in streaming mode #13027

@bauann

Description


Describe the bug
Since package version 1.63.0, when using Ollama + gpt-oss:20b, the function calling feature does not work properly in streaming mode. Instead of the function being invoked as expected, the chat session terminates immediately.

To Reproduce
Steps to reproduce the behavior:

  1. Target Semantic Kernel package version 1.63.0
  2. Add a plugin to the kernel,
    e.g.: builder.Plugins.AddFromType<YOUR_PLUGIN_TYPE>();
  3. Add an OllamaChatCompletion service with model gpt-oss:20b
  4. Start an async streaming chat (GetStreamingChatMessageContentsAsync) that should trigger the function call added above.
  5. The entire chat session ends immediately after a tool call request appears, without executing the tool call.
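The steps above can be sketched as a minimal console repro. This is an assumption-laden sketch, not code from the report: it assumes a local Ollama server at the default endpoint (http://localhost:11434), a hypothetical WeatherPlugin to give the model something to call, and the Microsoft.SemanticKernel.Connectors.Ollama package (whose connector is experimental, hence the SKEXP pragma):

```csharp
#pragma warning disable SKEXP0070 // Ollama connector is experimental

using System;
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

// Hypothetical plugin used only to provoke a tool call.
public sealed class WeatherPlugin
{
    [KernelFunction, Description("Gets the current weather for a city.")]
    public string GetWeather(string city) => $"It is sunny in {city}.";
}

public static class Program
{
    public static async Task Main()
    {
        var builder = Kernel.CreateBuilder();
        builder.AddOllamaChatCompletion(
            modelId: "gpt-oss:20b",
            endpoint: new Uri("http://localhost:11434")); // default Ollama endpoint
        builder.Plugins.AddFromType<WeatherPlugin>();     // step 2
        var kernel = builder.Build();

        var settings = new OllamaPromptExecutionSettings
        {
            FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() // enable auto tool invocation
        };

        var chat = kernel.GetRequiredService<IChatCompletionService>();
        var history = new ChatHistory();
        history.AddUserMessage("What is the weather in Tokyo?"); // should trigger GetWeather

        // Expected (and observed on 1.61.0/1.62.0): the tool is invoked and the
        // streamed answer uses its result. On 1.63.0+ the stream ends right after
        // the tool call request appears, without executing the tool.
        await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(history, settings, kernel))
        {
            Console.Write(chunk.Content);
        }
    }
}
```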

Expected behavior
The tool should be invoked properly, and the response should be generated based on the results returned from the tool call.

Platform

  • Language: C#
  • Source: NuGet package (1.63.0)
  • AI model: Ollama 0.11.7 + gpt-oss:20b
  • IDE: Rider
  • OS: macOS

Additional context
I tested both versions 1.63.0 and 1.64.0, and the same issue occurs in both.
However, in versions 1.62.0 and 1.61.0, the tool calls work correctly as expected.


Labels: .NET (Issue or Pull requests regarding .NET code), bug (Something isn't working), triage
