
Issue with using 'stream' option in TypeScript with CreateChatCompletionResponse

Fresh · over 2 years ago
Mar 14, 2026 · 0 views
Confidence Score: 87%
Problem

Describe the bug

I have been exploring how to use the 'stream' option in TypeScript for discussion #18. While implementing it, I found that CreateChatCompletionResponse does not have an 'on' property, which is required to stream the data. To work around this, I cast the CreateChatCompletionResponse to the Readable type, and it worked perfectly. However, I believe this is a problem that needs to be addressed, as it is not intuitive for TypeScript users.

To Reproduce

1. Use the createChatCompletion method to create a chat completion.
2. Add the 'stream' property and set it to 'true' to enable streaming.
3. Add the 'responseType' property to the axios config object and set it to 'stream' to specify the data format.
4. Try to access the 'on' property on the response data; it is not available in TypeScript.

Code snippets

[code block]

OS: Windows 11 22621.1413
Node version: v18.8.0
Library version: openai v3.2.1
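The failure and the cast workaround can be sketched as follows. The SSE-style payload is mocked here so the snippet runs without an API key; the mock chunks and parsing loop are illustrative, not part of the openai package — in real code the cast would be applied to `response.data` from `createChatCompletion`:

```typescript
import { Readable } from "stream";

// Mock of the server-sent-events payload the v3 API emits when
// stream: true and responseType: "stream" are set (illustrative data).
const chunks = [
  'data: {"choices":[{"delta":{"content":"Hello"}}]}\n\n',
  'data: {"choices":[{"delta":{"content":" world"}}]}\n\n',
  "data: [DONE]\n\n",
];
const mockStream = Readable.from(chunks);

// In real code, response.data is typed as CreateChatCompletionResponse,
// so TypeScript rejects .on(); casting to Readable is the workaround:
//   const stream = response.data as unknown as Readable;
const stream: Readable = mockStream;

let text = "";
stream.on("data", (chunk: Buffer | string) => {
  for (const line of chunk.toString().split("\n")) {
    const payload = line.replace(/^data: /, "").trim();
    if (!payload || payload === "[DONE]") continue;
    // Each SSE line carries a JSON chunk with a partial delta.
    text += JSON.parse(payload).choices[0].delta.content ?? "";
  }
});
stream.on("end", () => {
  console.log(text); // accumulated streamed content
});
```

The `as unknown as Readable` double cast is what the report describes: the value at runtime really is a Node stream, but the generated type hides that, so the cast only papers over the type definitions rather than fixing them.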


1 Fix

Canonical Fix (Moderate Confidence)
84% confidence · 100% success rate · 2 verifications · Last verified Mar 14, 2026

Solution: Issue with using 'stream' option in TypeScript with CreateChatCompletionResponse

Low Risk

I was able to resolve the issue by making the CreateChatCompletionResponse interface extend Readable:

`import { Readable } from 'stream'`

`export interface CreateChatCompletionResponse` → `export interface CreateChatCompletionResponse extends Readable`

OS: macOS
Node version: v19.4.0
Library version: openai v3.2.1
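A minimal, self-contained sketch of why the `extends Readable` change fixes the typing. The `CreateChatCompletionResponse` interface below is a local stand-in with made-up fields, not the generated type from the openai package:

```typescript
import { Readable } from "stream";

// Local stand-in for the openai v3 response type (the real one lives in
// the package's generated .d.ts). With the extends clause in place,
// stream methods like .on() type-check directly on the response.
interface CreateChatCompletionResponse extends Readable {
  id?: string;
  object?: string;
}

// Simulate a streaming response: a Readable carrying two chunks,
// decorated with the completion metadata fields.
const response: CreateChatCompletionResponse = Object.assign(
  Readable.from(["chunk-1", "chunk-2"]),
  { id: "cmpl-demo", object: "chat.completion.chunk" }
);

// No cast needed any more: .on() is part of the interface.
const received: string[] = [];
response.on("data", (chunk: string) => received.push(chunk));
```

Note that editing files under `node_modules` is lost on reinstall; the point of the report is that the published type definitions themselves should reflect the stream shape when `stream: true` is used.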


Validation

Resolved in openai/openai-node GitHub issue #107. Community reactions: 1 upvote.

Verification Summary

Worked: 2
Last verified Mar 14, 2026


Environment

Submitted by


Alex Chen

2450 rep

Tags

openai · gpt · llm · api · bug · fixed-in-v4