
Fix openai streaming decoding error when using provider webSearch tool #5938

Merged
IMax153 merged 1 commit into Effect-TS:main from tensor2077:fix-websearch-stream-decoding on Dec 30, 2025
Conversation

@tensor2077 (Contributor)

Type

  • Refactor
  • Feature
  • Bug Fix
  • Optimization
  • Documentation Update

Description

When sending a stream request, OpenAiClient.streamRequest decodes each SSE event.data into ResponseStreamEvent (OpenAiClient.ts:1803).

However, in real streams, ResponseOutputItemAddedEvent can emit a web_search_call item before it is fully populated: the action field is not present yet (it only shows up later, in response.output_item.done). This causes an early decode failure while OpenAiLanguageModel.makeStreamResponse is processing the tool-call stream parts.
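For illustration, an added-event payload of roughly this shape (field values hypothetical) fails to decode, because the web_search_call item carries no action yet:

```json
{
  "type": "response.output_item.added",
  "sequence_number": 4,
  "output_index": 0,
  "item": {
    "type": "web_search_call",
    "id": "ws_abc123",
    "status": "in_progress"
  }
}
```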

export class ResponseOutputItemAddedEvent extends Schema.Class<ResponseOutputItemAddedEvent>(
  "@effect/ai-openai/ResponseOutputItemAddedEvent"
)({
  /**
   * The type of the event. Always `"response.output_item.added"`.
   */
  type: Schema.Literal("response.output_item.added"),
  /**
   * The sequence number for this event.
   */
  sequence_number: Schema.Int,
  /**
   * The index of the output item that was added.
   */
  output_index: Schema.Int,
  /**
   * The output item that was added.
   */
  item: Generated.OutputItem
}) {}

Proposed Fix

I currently work around this by relaxing the schema in ResponseOutputItemAddedEvent so that it accepts an in-progress web_search_call item (status = "in_progress") without the action field.
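The idea behind the relaxation can be sketched without the library: treat action as optional only while the call is still in progress, and require it once the call has finished. The shapes and the name isValidWebSearchCall below are hypothetical illustrations, not the actual @effect/ai-openai schema code.

```typescript
// Simplified, hypothetical model of the streamed web_search_call item.
interface WebSearchAction {
  readonly type: string
  readonly query?: string
}

interface WebSearchCallItem {
  readonly type: "web_search_call"
  readonly id: string
  readonly status: "in_progress" | "searching" | "completed"
  // Absent on response.output_item.added; populated by response.output_item.done.
  readonly action?: WebSearchAction
}

// Accepts a web_search_call item even when `action` is missing, but only
// while the call is still in progress; a finished call must carry an action.
function isValidWebSearchCall(input: unknown): input is WebSearchCallItem {
  if (typeof input !== "object" || input === null) return false
  const item = input as Record<string, unknown>
  if (item.type !== "web_search_call" || typeof item.id !== "string") return false
  if (item.action === undefined) return item.status === "in_progress"
  return typeof item.action === "object" && item.action !== null
}
```

In the actual schema this would correspond to making the action field conditionally optional on the web_search_call variant, rather than loosening the whole OutputItem union.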

What do you think is the most appropriate way to model/handle these situations? @IMax153

Reproduce

You can reproduce this by enabling the web_search tool and asking the model to perform a search in streaming mode.

Toolkit.make(
  OpenAI.OpenAiTool.WebSearch({
    search_context_size: 'medium'
  })
)

@tensor2077 tensor2077 requested a review from IMax153 as a code owner December 30, 2025 06:45
changeset-bot (bot) commented Dec 30, 2025

🦋 Changeset detected

Latest commit: 213c49f

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
  @effect/ai-openai (Patch)


@tensor2077 (Contributor, Author)

A follow-up issue #5939

@IMax153 (Member) commented Dec 30, 2025

@tensor2077 - thank you for the detailed pull request.

It doesn't feel quite right to implement this fix in the OpenAiClient; it feels a bit like a "hack", since if more tools requiring similar fixes are added in the future, this approach could become unmaintainable. But I also don't think there's a better short-term fix right now, so I'm willing to move this forward.

The good thing is that in Effect v4, our OpenAPI generator will be much more robust, so we can actually generate the streaming output types, which should alleviate the need for these patches.

@IMax153 IMax153 merged commit 72f61be into Effect-TS:main Dec 30, 2025
11 checks passed