
ChatCompletionStream.fromReadableStream errors due to missing finish_reason for choice

Mar 14, 2026
Problem

Confirm this is a Node library issue and not an underlying OpenAI API issue
- [x] This is an issue with the Node library

Describe the bug
When trying to use the API described in https://github.com/openai/openai-node/blob/2242688f14d5ab7dbf312d92a99fa4a7394907dc/examples/stream-to-client-browser.ts, I get an error. It looks like the code expects `finish_reason` to be populated, but the finish details are now in a property called `finish_details`.

To Reproduce
Set up a server that responds with chat completion streams. Then, in the client, use the `ChatCompletionStream.fromReadableStream` API, e.g.: [code block]

OS: Windows
Node version: 18.12.1
Library version: 4.16.1
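The failure is easier to see against the wire format. The sketch below simulates what the client receives, assuming (as the linked example suggests) that the server pipes the SDK stream to the browser via `toReadableStream()`, which emits newline-delimited JSON chunks. The chunk shapes and the tiny parser are illustrative stand-ins based on the issue report, not the library's own code; the point is that the final chunk arrives with `finish_details` where the parser expects `finish_reason`.

```typescript
import { ReadableStream } from 'node:stream/web';

// Chunk shapes mirroring the issue report (hypothetical, not the SDK's types).
const wireChunks = [
  { id: 'chatcmpl-1', choices: [{ index: 0, delta: { content: 'Hello' }, finish_reason: null }] },
  // Final chunk as reported: finish info under `finish_details`, no `finish_reason`.
  { id: 'chatcmpl-1', choices: [{ index: 0, delta: {}, finish_details: { type: 'stop' } }] },
];

// Build a byte stream of newline-delimited JSON, like toReadableStream() would.
function mockBody(): ReadableStream<Uint8Array> {
  const enc = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    start(controller) {
      for (const c of wireChunks) controller.enqueue(enc.encode(JSON.stringify(c) + '\n'));
      controller.close();
    },
  });
}

// Minimal line-delimited JSON reader standing in for the library's parser.
async function readAll(body: ReadableStream<Uint8Array>): Promise<any[]> {
  const dec = new TextDecoder();
  const out: any[] = [];
  let buf = '';
  let nl: number;
  for await (const part of body) {
    buf += dec.decode(part, { stream: true });
    while ((nl = buf.indexOf('\n')) !== -1) {
      out.push(JSON.parse(buf.slice(0, nl)));
      buf = buf.slice(nl + 1);
    }
  }
  return out;
}
```

Any consumer that reads `choices[0].finish_reason` off the second chunk gets `undefined`, which matches the reported failure mode.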



1 Fix

Canonical Fix (moderate confidence)
84% confidence, 100% success rate, 2 verifications. Last verified Mar 14, 2026.

Solution: ChatCompletionStream.fromReadableStream errors due to missing finish_reason for choice

Low Risk

Yeah, the API returns `finish_details`. I'm actually not getting an exception now though, so I'm not sure what was done internally to fix or clean that up.
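If you are pinned to an affected library version, one way to cope on the client is to normalize each chunk before anything relies on `finish_reason`. The helper and types below are a hypothetical workaround sketch based on the shapes described in the issue, not part of the openai-node API:

```typescript
// Hypothetical chunk shapes, mirroring the issue report (not the SDK's types).
type FinishDetails = { type: string; stop?: string };
type Choice = {
  index: number;
  delta: Record<string, unknown>;
  finish_reason?: string | null;
  finish_details?: FinishDetails | null;
};
type Chunk = { id: string; choices: Choice[] };

// Copy finish_details.type into finish_reason when the latter is absent,
// so downstream code that expects finish_reason keeps working.
function normalizeChunk(chunk: Chunk): Chunk {
  return {
    ...chunk,
    choices: chunk.choices.map((c) => ({
      ...c,
      finish_reason: c.finish_reason ?? c.finish_details?.type ?? null,
    })),
  };
}
```

Applying this to each parsed chunk before handing it to stream-consuming code sidesteps the missing `finish_reason`; upgrading the library once the upstream fix lands is the real solution.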


Validation

Resolved in openai/openai-node GitHub issue #499. Community reactions: 1 upvote.


Submitted by

Alex Chen

Tags

openai, gpt, llm, api, bug