FG

OpenAI

83 verified issues

🤖 AI & LLMs · OpenAI
Fresh · about 20 hours ago

Support for zod 4

Confirm this is a Node library issue and not an underlying OpenAI API issue - [x] This is an issue with the Node library Describe the bug Hey OpenAI team 👋 After updating to Zod v4 I'm hitting this runtime error: [code block] Looks like the vendored `zod-to-json-schema` in `openai/helpers/zod` still expects the `ZodFirstPartyTypeKind` export, which was removed in Zod v4. Notes / ideas Pinning zod@3 is a temporary workaround, but it blocks upgrading the whole stack. A quick fix might be bumping the vendored zod-to-json-schema to a version that supports Zod v4, or swapping it out for the maintained `@asteasolutions/zod-to-openapi`, which already handles the new internals. Thanks for taking a look! Let me know if I can help with more info or a PR. To Reproduce 1. Fresh project with pnpm 2. pnpm add openai@5.8.3 zod@^4 3. Add a simple script: `import { zodResponseFormat } from 'openai/helpers/zod'` 4. pnpm tsx index.ts (or just bundle with esbuild) → error above. Code snippets [code block] OS macOS Node version Node v22 Library version Latest
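Until the vendored helper supports Zod v4, the pin-to-v3 workaround mentioned in the report can be made explicit in package.json. A minimal sketch (version numbers illustrative):

```json
{
  "dependencies": {
    "openai": "5.8.3",
    "zod": "^3.23.8"
  }
}
```

This keeps the rest of the stack on the SDK's supported Zod major until the helper is updated.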

Confidence: 95%
Candidate Fix · 2 fixes
🤖 AI & LLMs · OpenAI
Fresh · over 1 year ago

"lib/IsolatedGPT35TurboMutation/deleteFineTuneModel: AbortController is not defined"

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug What is this error? It works fine locally, but when deploying I get this error: "lib/IsolatedGPT35TurboMutation/deleteFineTuneModel: AbortController is not defined" To Reproduce if (job.model_id) { if (!src.openAiKey) throw new Error('OpenAiKey not found'); const openai = new OpenAI({ apiKey: src.openAiKey }); const model = await openai.models.del(job.model_id); Code snippets _No response_ OS macOS Node version Node 18 Library version openai 4.0.8
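This error usually means the deployment runtime predates the global `AbortController` (Node < 15, or an older serverless image), which openai v4 relies on for request cancellation. A minimal feature-detect sketch that can run before constructing the client (the function name is illustrative, not SDK code):

```javascript
// Fail fast with a clear message when the runtime lacks AbortController,
// instead of crashing deep inside the SDK's request code.
function assertAbortControllerAvailable() {
  if (typeof globalThis.AbortController === 'undefined') {
    throw new Error(
      'AbortController is not defined: upgrade to Node >= 16 or load a ' +
      'polyfill (e.g. the "abort-controller" package) before importing openai.'
    );
  }
  return true;
}
```

On a current Node 18+ runtime the check passes; on the failing deployment it would pinpoint the missing global instead of surfacing it mid-request.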

Confidence: 89%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 1 year ago

Can't build NextJS project with openai library. Getting: Type error: Private identifiers are only available when targeting ECMAScript 2015 and higher.

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug I'm getting this kind of error at build time on NextJS 14 and I don't know why. This is my tsconfig.json: [code block] Thanks for the help. To Reproduce 1) Install and use library on nextjs 2) import something like `import type { Message } from 'openai/resources/beta/threads/messages';` Code snippets _No response_ OS macOS Node version v22.3.0 Library version 4.52.3
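The `#private` identifiers in the SDK's sources require a compile target of ES2015 or newer, so the usual fix is in tsconfig.json. A minimal sketch of the relevant fragment (other options omitted; assuming the rest of the project tolerates the newer target):

```json
{
  "compilerOptions": {
    "target": "ES2017",
    "skipLibCheck": true
  }
}
```

`skipLibCheck` additionally stops tsc from type-checking declaration files under node_modules, which sidesteps this whole class of error from dependencies.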

Confidence: 89%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

ChatCompletionStream.fromReadableStream errors due to missing finish_reason for choice

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug When trying to use the API described here https://github.com/openai/openai-node/blob/2242688f14d5ab7dbf312d92a99fa4a7394907dc/examples/stream-to-client-browser.ts I'm getting an error at the following point: where the actual choices look like this: Looks like the code expects `finish_reason` to be populated, but the finish details are now in a property called `finish_details`? To Reproduce Set up a server that responds with chat completion streams Then in the client try to use the `ChatCompletionStream.fromReadableStream` API, e.g.: [code block] Code snippets _No response_ OS Windows Node version 18.12.1 Library version 4.16.1
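A defensive reader can tolerate both shapes described in this report while the field settles; `getFinishReason` is a hypothetical helper for illustration, not part of the SDK:

```javascript
// Prefer the documented finish_reason, falling back to the transient
// finish_details.type shape this issue describes.
function getFinishReason(choice) {
  return choice.finish_reason ?? (choice.finish_details && choice.finish_details.type) ?? null;
}
```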

Confidence: 89%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · about 2 years ago

How to stop streaming

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug I use `stream.abort();` to stop receiving from the API, but I get the exception below [code block] I have been following the guide in the documentation > If you need to cancel a stream, you can break from a for await loop or call `stream.abort()`. To Reproduce [code block] Code snippets _No response_ OS Ubuntu Node version 16.15.1 Library version v4.28.0
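The two documented cancellation paths can be sketched against a stand-in stream (a plain async generator below, since a live API call isn't reproducible here). Breaking out of the loop raises no exception at all:

```javascript
// Path 1: break out of the `for await` loop — no exception is raised,
// and the underlying stream is closed by the loop's early exit.
async function takeFirst(stream, n) {
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(chunk);
    if (chunks.length >= n) break;
  }
  return chunks;
}

// Stand-in for a completion stream.
async function* fakeStream() {
  for (let i = 0; i < 10; i++) yield i;
}
```

For path 2, `stream.abort()`, the SDK rejects the consuming loop with an abort error (`APIUserAbortError` in openai v4), so the loop should be wrapped in try/catch that treats that error as a normal cancellation rather than letting it surface as the exception seen in this report.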

Confidence: 88%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · 11 months ago

Importing & using AssistantStream breaks Angular SSR

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug Just by adding the following lines in my new Angular project, I get an error. Lines: [code block] On compiling the code I get the following error: [code block] Now I'm unsure whether or not this is something you guys can fix, or it should be somehow reported to Angular or the polyfill... but I thought I'd start here, because I really have no clue what's going on. The internals of the AssistantStream are somewhat beyond me and I have no experience with handling streams in TypeScript. Feel free to close this issue and report it to the right place - or I can do it with the right info. To Reproduce 1. Install Angular normally with [code block] As part of the installation prompts, choose to use SSR 2. Add this lib [code block] 3. Change the app.component.ts to include the openai AssistantStream [code block] 4. Run Angular and watch it crash [code block] Code snippets [code block] OS Linux Mint Node version 20.9.0 and 22.1.0 Library version 4.51.0

Confidence: 88%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

createTranscription() doesn't work as expected in NodeJS

Describe the bug Hi! I tried to follow the documentation when I was writing a transcription script using NodeJS. I wanted to get a response in .srt format, but it returns an error. I tried using the argument `response_format` as well as `responseFormat()`, but that didn't work. Also, there is only one way to communicate with the OpenAI API: [code block] But anyway, it doesn't work if I would like to specify the output file format. To Reproduce 1. Run the function (one of them) 2. Get `Required parameter model was null or undefined` error Code snippets [code block] [code block] OS macOS Node version Node v.16.13.0 Library version openai v.3.2.1

Confidence: 88%
✓ Verified Fix Available · 1 fix · ✓ 3 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

TS7030: Not all code paths return a value

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug When building using `tsc` with `openai` as a dependency, I get the following error: node_modules/openai/src/lib/AbstractChatCompletionRunner.ts(224,28): error TS7030: Not all code paths return a value. I also had to disable the `noImplicitAny` rule since `openai` is not built, and then my app build inherits your source code basically. To Reproduce `const OpenAI = require('openai');` and build with `tsc` (Ideally, I would like this to be an automated test for this library, so build errors do not repeat.) Code snippets _No response_ OS macOS, ubuntu Node version Node 18 Library version 4.15.1

Confidence: 88%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

fetch is not exported from 'openai/_shims/fetch'

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug I am getting the following error when trying to instantiate my OpenAI client. [code block] To Reproduce Instantiate OpenAI with the following: [code block] Code snippets The OpenAI definition in my package.json is `"openai": "^4.2.0"` My `tsconfig.json` file: [code block] OS macOS Node version v20.2.0 Library version 4.2.0

Confidence: 88%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

Add a way to avoid printing the finalContent when using runFunctions

Confirm this is a feature request for the Node library and not the underlying OpenAI API. - [X] This is a feature request for the Node library Describe the feature or improvement you're requesting I have a routine that prints the alphabet A-Z. One function prints one letter at a time. The problem is that the OpenAI SDK also outputs the entire alphabet _again_, at the end. Is there a way that would force it to return early without bothering with producing the final output? [code block] Here the last `console.log` will print something along the lines of: [code block] I tried tapping into the `.on('finalFunctionCall', ...)` event, but that happens after `finalContent()` already has a result. I really just need some sort of event that fires before it starts generating the final output, so I could abort early. The example uses alphabet listing, which is quick. But in a real-world scenario, that final content can take a long time/many tokens to generate.

Confidence: 88%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · 11 months ago

Model seems to ignore `.optional()` fields, and instead uses falsy values such as 0, empty strings, etc.

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug Model seems to ignore `.optional()` fields, and instead uses falsy values, empty strings, etc. I'm not sure if it's a library issue or a ChatGPT model issue. To Reproduce I suspect this is a known issue or an upstream problem, but let me know if not and I'll send a repro Code snippets _No response_ OS ubuntu Node version 22.7.0 Library version 4.57.0
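With structured outputs, every property has to appear in the schema's `required` list, so `.optional()` fields are effectively not honored; the common workaround is to model them as nullable instead, so the model can still decline to fill them. A hypothetical helper applying that transform to a plain JSON Schema object (illustration only, not SDK code):

```javascript
// Make every property required, converting formerly-optional ones to
// "value or null" so omission becomes an explicit null instead of a
// made-up falsy value.
function optionalToNullable(schema) {
  const required = new Set(schema.required ?? []);
  const properties = {};
  for (const [name, prop] of Object.entries(schema.properties ?? {})) {
    properties[name] = required.has(name) ? prop : { anyOf: [prop, { type: 'null' }] };
  }
  return { ...schema, properties, required: Object.keys(properties) };
}
```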

Confidence: 87%
✓ Verified Fix Available · 1 fix · ✓ 3 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

"warning":"This model version is deprecated. Migrate before January 4, 2024

Describe the bug I'm using v3.3.0, attempting to make a call with text-davinci-003 [code block] I get this sort of return: {"warning":"This model version is deprecated. Migrate before January 4, 2024 to avoid disruption of service. Learn more https://platform.openai.com/docs/deprecations"...... I go to my account and then try to use gpt-3.5-turbo-instruct for the model, and promptly get a response of Model does not exist. The information provided in the blog posts says this is a drop-in replacement. I'm kinda stuck, and I don't believe an API call should return a warning as the object and a 200 response. To Reproduce 1. Create a Node application 2. Add the code to call the API 3. Make the call and watch the result Code snippets [code block] OS Windows 11 Node version Node 18.12.1 Library version v3.3.0

Confidence: 87%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

openai@4.12 does not work on the Deno runtime?

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug I tried to import openai in my Netlify edge function; it worked well days ago, but today it raises: [code block] Note that I found `/denonext/resources.js` missing from esm.sh. I don't know if this is related. To Reproduce Create a Netlify edge function with just [code block] and deploy it Code snippets _No response_ OS Windows Node version deno Library version openai v4.12

Confidence: 87%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

openai-node v3.2.1 createImageEdit() endpoint stopped working

Describe the bug I've been using openai-node v3.1.0 and openai.createImageEdit() worked well. I updated to v3.2.1 to support the createChatCompletion() method, however my createImageEdit() is not working anymore. I get Error: Request failed with status code 400. If I downgrade to 3.1.0 it starts working again. Has createImageEdit() been changed in the latest update, so my call does not satisfy the new API? I will add a code snippet with my method which works in v3.1.0 but doesn't in v3.2.1 <img width="567" alt="Screenshot 2023-03-03 at 12 40 49" src="https://user-images.githubusercontent.com/55192345/222686888-ee00c984-2d88-46df-8f8b-ba8bbddd6e65.png"> To Reproduce 1. update from v3.1.0 to v3.2.1 2. call createImageEdit (implementation provided in snippets section) Code snippets [code block] OS macOS Node version v18.7.0 Library version 3.2.1

Confidence: 87%
✓ Verified Fix Available · 1 fix · ✓ 3 verified
🤖 AI & LLMs · OpenAI
Fresh · almost 2 years ago

4.40.0 -> 4.40.1: Breaking change - OpenAI is not a constructor

Confirm this is a Node library issue and not an underlying OpenAI API issue - [x] This is an issue with the Node library Describe the bug Based on numbers, 4.40.1 is supposed to be a bug fix release over 4.40.0, so I didn't expect something to break. I have not tested the application after 4.40.1 to see if anything else is broken. const openai = new OpenAI({ ^ TypeError: OpenAI is not a constructor To Reproduce Code saved as test.js: [code block] [code block] Code snippets _No response_ OS WSL Ubuntu 24.04 LTS Node version v20.12.2 Library version 4.40.1

Confidence: 87%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

Issue with using 'stream' option in TypeScript with CreateChatCompletionResponse

Describe the bug I have been exploring how to use the 'stream' option in TypeScript for discussion #18. While trying to implement it, I found that the CreateChatCompletionResponse does not have a property 'on', which is required to stream the data. To overcome this issue, I converted the CreateChatCompletionResponse to Readable type, and it worked perfectly. However, I believe this is a problem that needs to be addressed as it is not intuitive for TypeScript users. To Reproduce 1. Use the createChatCompletion method to create a chat completion. 2. Add the 'stream' property and set it to 'true' to enable streaming. 3. Add the 'responseType' property to the axios config object and set it to 'stream' to specify the data format. 4. Try to access the 'on' property in the response data, which is not available in TypeScript. Code snippets [code block] OS Windows 11 22621.1413 Node version Node v18.8.0 Library version openai v3.2.1

Confidence: 87%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

NextJS navigator is not defined - azure v4 stream true

Describe the bug Using the latest v4 server-side, there is an error when calling, for example, chat.create. Here https://github.com/openai/openai-node/blob/3ec43ee790a2eb6a0ccdd5f25faa23251b0f9b8e/src/core.ts#L806 you can see the code: if (!navigator || typeof navigator === 'undefined') { which checks whether navigator is undefined "after" already referencing it with !navigator To Reproduce Call chat.completions.create with stream true, but using it on Azure from this sample https://github.com/openai/openai-node/blob/3ec43ee790a2eb6a0ccdd5f25faa23251b0f9b8e/examples/azure.ts#L5 Code snippets _No response_ OS macOS Node version node 18 Library version v4.0.0

Confidence: 86%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

TypeScript compilation errors in backend environment

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug Hello, I recently encountered several TypeScript compilation errors when using this library in a backend project. The issue primarily seems related to missing DOM types. Here are the errors I encountered when I run `tsc` command: [code block] For reference, my tsconfig.json is as follows: [code block] While adding the `dom` library to `lib` of my tsconfig.json does seem to resolve these errors, it causes references in other parts of the code to inadvertently change, leading to build failures. Furthermore, I believe that adding the dom library to a backend codebase isn't appropriate. I've verified that the problem is not related to React's StrictMode. Upon checking the openai library's tsconfig.json, I noticed it uses the skipLibCheck option, but adding that to my project's configuration did not resolve the issue. I wanted to raise this issue here to check if it's a known issue, or if there are any recommended workarounds. Any assistance or guidance would be appreciated. Thank you. To Reproduce 1. yarn add openai 2. run `tsc` on my repository 3. See error Code snippets _No response_ OS macOS Node version Node v18.4.0 Library version openai v4.8.0

Confidence: 86%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · 10 months ago

Can not use OpenAI SDK with Sentry Node agent: TypeError: getDefaultAgent is not a function

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug Referenced previously here, closed without resolution: https://github.com/openai/openai-node/issues/903 This is a pretty big issue as it prevents usage of the SDK while using the latest Sentry monitoring package. To Reproduce 1. Install Sentry Node sdk via `npm i @sentry/node --save` 2. Enter the following code; [code block] 3. Try to create a completion somewhere in the process after Sentry has been initialized: [code block] Results in error: [code block] Code snippets (Included) OS All operating systems (macOS, Linux) Node version v20.10.0 Library version v4.56.0

Confidence: 86%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

Module not found error on v4.5.0

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug I was following the tutorial to integrate nextjs with openai and noticed that the latest release (v4.5.0) is throwing a module not found error. [code block] Maybe this is something related to this change: Important: I downgraded to version v4.4.0 and everything worked as expected. To Reproduce 1. Install release v4.5.0 2. `import OpenAI from "openai";` Code snippets _No response_ OS Debian Node version Node v18.17.1 Library version openai v4.5.0

Confidence: 86%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · 10 months ago

Replace node-fetch with undici

Confirm this is a feature request for the Node library and not the underlying OpenAI API. - [X] This is a feature request for the Node library Describe the feature or improvement you're requesting I noticed this library is still using node-fetch because Node's native fetch is considered _experimental_. I think it'd be in the library's best interest to switch to undici instead. Undici is the fetch implementation in Node.js. For all intents and purposes it is _stable_ (https://github.com/nodejs/undici/issues/1737). We (the maintainers of Undici) have some concerns about marking it as such in Node.js just yet because of the nature of the Fetch API spec (it itself adds breaking changes occasionally; this doesn't fit well with Node.js's versioning strategy. It's complicated; read the issue I linked above for more details). Swapping in undici for the shim will enable a significantly easier upgrade path in the future whenever we figure out how to mark it as properly _stable_ in Node.js. Happy to help swap this out too if the maintainers approve 😄 🚀 Additional context _No response_

Confidence: 86%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

createChatCompletion seems to ignore the abort signal

Describe the bug Sending an 'abort' signal to the `createChatCompletion` does not raise an error nor stop the completion. It makes me believe that this discussion on the openai community is true https://community.openai.com/t/cancelling-openai-apis-request/99754, but I would like to verify it isn't a bug in this library. To Reproduce Here's my code [code block] Expectation: I should see output like this, and then an error should be raised: [code block] Actual: I see output like this that never stops: [code block] Code snippets _No response_ OS macOS Node version v19.8.1 Library version openai v3.2.1
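Even when the transport ignores the signal, a local guard can at least stop the caller from waiting: race the request promise against the `AbortSignal`. This is a generic sketch, not part of the v3 SDK (the helper name is illustrative), and the server may still finish generating in the background:

```javascript
// Reject locally as soon as the signal fires, even if the underlying
// transport keeps the request running.
function withAbort(promise, signal) {
  return new Promise((resolve, reject) => {
    const abort = () => reject(new Error('Request aborted by caller'));
    if (signal.aborted) return abort();
    signal.addEventListener('abort', abort, { once: true });
    promise.then(resolve, reject);
  });
}
```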

Confidence: 86%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · 10 months ago

punycode Deprecation Warning via node-fetch Dependency

Describe the bug When using the openai package in my Node.js project, I encounter the following deprecation warning: After investigating, I found that the openai package relies on node-fetch@2.7.0, which in turn depends on whatwg-url@5.0.0. The whatwg-url package is causing the warning as it depends on the deprecated punycode module. To Reproduce 1. Install openai@latest in a Node.js project. 2. Run any script importing the openai package. 3. Observe the deprecation warning related to punycode. Code snippets _No response_ OS Fedora Node version 22.11.0 Library version 4.73.1

Confidence: 86%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

I'm not sure why, but I keep getting this error.

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug I keep getting this error, but I have already consulted the documentation to ensure that my code is correct. I have the most up-to-date version of openai installed through npm. To Reproduce I am using expo react native. The dataURI is the path to the audio recording saved in m4a format. Code snippets _No response_ OS pop os linux Node version Node v18.16.0 Library version openai ^4.3.1

Confidence: 85%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

node_modules/openai/src/streaming.ts(187,18): error TS7006: Parameter 'ctrl' implicitly has an 'any' type.

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug node_modules/openai/src/streaming.ts(187,18): error TS7006: Parameter 'ctrl' implicitly has an 'any' type. To Reproduce "openai": "^4.20.1", Code snippets _No response_ OS macOS Node version rwest@Roshis-MacBook-Pro slackgpt3 % node -v v21.2.0 Library version "openai": "^4.20.1",

Confidence: 85%
Candidate Fix · 1 fix · ✓ 1 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

Stream handling does not recognize stream errors

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug For example, when a chat completion is created like: [code block] And the underlying stream generates an error, this library will not surface the error to the application. Instead, it generates an uncaught exception error. To Reproduce See https://github.com/jsumners-nr/openai-stream-issue/tree/91e2b46b08baec3cd061a02b9492b501a353aab3 Code snippets [code block] OS macOS Node version 18.18.2 Library version 4.20.0

Confidence: 85%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · over 1 year ago

Batch error when uploading vector files to a file_search tool AI assistant

Confirm this is a Node library issue and not an underlying OpenAI API issue - [X] This is an issue with the Node library Describe the bug When I upload vector files to a file_search type tool assistant, following the instructions from the docs, I get this error when the data finishes processing: `js file-batches.js:99 if (files === null || files.length == 0) { ^ TypeError: Cannot read properties of undefined (reading 'length') ` To Reproduce 1. install 2. create an AI assistant with tool type file_search 3. upload files for this tool Code snippets _No response_ OS win 10 Node version >=20 Library version >=5.4.4

Confidence: 84%
Candidate Fix · 1 fix · ✓ 2 verified
🤖 AI & LLMs · OpenAI
Fresh · 5 months ago

zodTextFormat breaks with Zod 4

Confirm this is a Node library issue and not an underlying OpenAI API issue - [x] This is an issue with the Node library Describe the bug As per this issue, the `zod-to-json` library, which I believe the OpenAI SDK uses under the hood, is not compatible with Zod v4. So e.g. this official tutorial won't work. Also the error is not very descriptive, as we only get: [code block] To Reproduce Install the latest zod (4.x.x) and openai [code block] Then this snippet will fail. [code block] Then rerun the same code with zod 3 and it should work [code block] [code block] OS macOS Node version 22.14.0 Library version openai@5.11.0

Confidence: 80%
✓ Verified Fix Available · 1 fix · ✓ 5 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

createTranscription File type does not exist in node

Describe the bug createTranscription's interface is defined as: `createTranscription(file: File, ....)` However, `File` does not exist in node.js. This is a browser only class. What is expected here? A return value from fs.readFileSync? To Reproduce [code block] Code snippets _No response_ OS macOS Node version Node v14.19.0 Library version openai 3.2.1

Confidence: 80%
✓ Verified Fix Available · 1 fix · ✓ 4 verified
🤖 AI & LLMs · OpenAI
Fresh · about 19 hours ago

[Whisper] Support OGG file extension

Describe the feature or improvement you're requesting Dear OpenAI Team, I am writing to request the addition of OGG file format support in the Whisper model. As you know, OGG is a popular open-source multimedia container format that is widely used for streaming, storing, and transmitting digital multimedia content such as audio and video. Currently, the Whisper model supports only a limited number of audio file formats, such as WAV and MP3. However, many users, including myself, prefer to use OGG format due to its superior compression, quality, and open-source nature. Therefore, I would like to request that the OpenAI team considers adding OGG file format support to the Whisper model. This would allow users to process and generate high-quality audio content in OGG format, which is important for many applications such as music production, podcasting, and voiceover work. I believe that adding support for OGG file format in the Whisper model would be a valuable addition to the platform, and would help to expand the range of options available to users. Thank you for your consideration, and I look forward to hearing your response. Sincerely, Ido Additional context Specifically `opus codecs`

Confidence: 80%
Candidate Fix · 1 fix
🤖 AI & LLMs · OpenAI
Fresh · over 1 year ago

Does this library distinguish `429 - Too many requests` from `429 - Too many tokens`? [question]

Describe the bug Sorry, not a bug - just a question! The OpenAI docs stipulate that they enforce rate limits in 2 ways: by request, and by -- source here I'm wondering if this library distinguishes between the two. I don't think it does, because here is an error log I have for the `429`: [code block] Upon confirmation from a maintainer that it doesn't, I will open a feature request requesting this differentiation. Thank you! P.S. I'd request a 3rd option for issue submission, a new `Question` one, in addition to the current `Bug` and `Feature request` options. To Reproduce N/A Code snippets [code block] OS mac Ventura (13.0.1) Node version 16.16 Library version 3.0.0

Confidence: 79%
✓ Verified Fix Available · 1 fix · ✓ 4 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

I see the project was updated a few hours ago, does this support "gpt-3.5-turbo"

Describe the feature or improvement you're requesting I tried the new "gpt-3.5-turbo" with my previous install and I get a 404 error on return. Has this been updated for the new "gpt-3.5-turbo" model? Additional context _No response_

Confidence: 79%
✓ Verified Fix Available · 1 fix · ✓ 6 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

support Microsoft Azure OpenAI service endpoints

Describe the feature or improvement you're requesting Update the API configuration to support Azure OpenAI endpoints as well. In order to use the Python OpenAI library with Microsoft Azure endpoints, we need to set the api_type, api_base and api_version in addition to the api_key. The api_type must be set to 'azure' and the others correspond to the properties of your endpoint. In addition, the deployment name must be passed as the engine parameter. Python:

import openai
openai.api_type = "azure"
openai.api_key = "..."
openai.api_base = "https://example-endpoint.openai.azure.com"
openai.api_version = "2022-12-01"

# create a completion
completion = openai.Completion.create(engine="deployment-name", prompt="Hello world")

# print the completion
print(completion.choices[0].text)

Additional context _No response_

Confidence: 79%
✓ Verified Fix Available · 1 fix · ✓ 6 verified
🤖 AI & LLMs · OpenAI
Fresh · about 2 years ago

How to stream Text to Speech?

According to the documentation here for Text to Speech: https://platform.openai.com/docs/guides/text-to-speech?lang=node There is the possibility of streaming audio without waiting for the full file to buffer. But the example is a Python one. Is there any possibility of streaming the incoming audio using Node JS?

Confidence: 78%
✓ Verified Fix Available · 1 fix · ✓ 5 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

Mismatch between `createFile(file: File)` and `createReadStream` in docs

Describe the bug The typings were updated such that the signature is `createFile(file: File)`, but the docs example shows a `ReadStream` being provided. `File` is not available in Node. What is meant to be done here? Is this a typo, should be `File | ReadStream`? To Reproduce Try to pass a `ReadStream` to `createFile()`, see type error. Code snippets _No response_ OS N/A Node version latest Library version latest

Confidence: 78%
✓ Verified Fix Available · 1 fix · ✓ 4 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

[Whisper] cannot call `createTranscription` function from Node.js due to File API

Describe the bug Cannot call `createTranscription` function like below: [code block] This is because `createTranscription` interface asks me for File API, which is mainly for Browser API. [code block] How can I use this function from Node.js? Thanks! --- [code block] To Reproduce [code block] Code snippets _No response_ OS MacOS Node version Node v18.14.2 Library version openai v3.2.1

Confidence: 77%
✓ Verified Fix Available · 1 fix · ✓ 7 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

Library is not compatible with nextjs

Describe the bug I found several issues when I try to build a nextjs application using openai-node: - TypeScript compiler option isolatedModules is set to true => Cannot use 'export import' on a type or type-only namespace when 'isolatedModules' is enabled.ts(1269) - use of "#" to define a private property not working To Reproduce create a sample application with: npx create-next-app@latest install openai-node write a simple chatCompletion in a page component Launch "npm run build" You will see errors on the "checking validity of types" step. Code snippets _No response_ OS macOS Node version 16.8.x and more Library version 4.00 and more

Confidence: 76%
✓ Verified Fix Available · 1 fix · ✓ 3 verified
🤖 AI & LLMs · OpenAI
Fresh · over 2 years ago

How to use stream: true?

I'm a bit lost as to how to actually use `stream: true` in this library. Example incorrect syntax: [code block]
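With the v3 (axios-based) client, `stream: true` plus `responseType: 'stream'` yields raw server-sent-event frames: each chunk contains one or more `data: {...}` lines, terminated by a `data: [DONE]` sentinel. A minimal parser sketch for those frames (the frame format is what the API emits; the helper itself is illustrative):

```javascript
// Extract the JSON payloads from an SSE chunk, skipping the [DONE] sentinel.
function parseSseChunk(text) {
  return text
    .split('\n')
    .filter(line => line.startsWith('data: '))
    .map(line => line.slice('data: '.length).trim())
    .filter(payload => payload && payload !== '[DONE]')
    .map(payload => JSON.parse(payload));
}
```

Each parsed event carries a `choices[0]` delta whose text can be appended to the running completion. (The v4 SDK later made this unnecessary by returning an async iterable directly.)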

Confidence: 75%
✓ Verified Fix Available · 1 fix · ✓ 6 verified
🤖 AI & LLMs · OpenAI
Fresh · 22 days ago

OpenAI API 429 rate limit despite low request count โ€” TPM limit reached, not RPM

The OpenAI API returns 429 errors even when request count is well below the documented requests-per-minute (RPM) limit. The actual limit being hit is tokens-per-minute (TPM). A single GPT-4o request can consume 4,000+ tokens, so 50 concurrent requests easily exceeds the TPM budget even though only 50 RPM are used. The error message mentions the RPM limit but the real constraint is TPM.
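Because the binding constraint is tokens rather than requests, counting requests client-side cannot prevent these errors; backing off on 429 can. A minimal retry sketch (the `status` property mirrors what SDK errors carry; the helper name and defaults are illustrative):

```javascript
// Retry a call with exponential backoff whenever it fails with HTTP 429.
async function withBackoff(fn, { retries = 5, baseMs = 200 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (err.status !== 429 || attempt >= retries) throw err;
      await new Promise(resolve => setTimeout(resolve, baseMs * 2 ** attempt));
    }
  }
}
```

When the response includes a Retry-After header, honoring it is better than a fixed schedule; capping concurrency so in-flight token usage stays under the TPM budget helps further.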

Confidence: 72%
✓ Verified Fix Available · 1 fix · ✓ 7 verified
🤖 AI & LLMs · OpenAI
Fresh · 5 months ago

LLM response contains markdown code fences that break JSON.parse()

When prompting an LLM to return JSON, the model often wraps the output in markdown code fences (```json ... ```) even when explicitly told not to. Passing this raw response to JSON.parse() throws SyntaxError. This happens with GPT-4o, Claude, and Gemini โ€” any model that has been fine-tuned to be "helpful" tends to add markdown formatting.
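A tolerant parsing step is more reliable than prompt instructions alone: strip a surrounding fence if present, then parse. A minimal sketch (helper name illustrative):

```javascript
// Accept either bare JSON or JSON wrapped in a ``` / ```json fence.
function parseModelJson(text) {
  const trimmed = text.trim();
  const fenced = trimmed.match(/^```(?:json)?\s*([\s\S]*?)\s*```$/);
  return JSON.parse(fenced ? fenced[1] : trimmed);
}
```

Where available, requesting a JSON response mode (or structured outputs) from the provider removes the problem at the source; the stripping step then remains as a cheap safety net.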

Confidence 71%
✓ Verified Fix Available
1 fix · 6 verified
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

Invalid file 'image': unsupported mimetype ('application/octet-stream'). Supported file formats are 'image/png'.

Confirm this is a Node library issue and not an underlying OpenAI API issue
- [x] This is an issue with the Node library

Describe the bug
Since yesterday, a call to the openai.images.edit API throws an error: `BadRequestError: 400 Invalid file 'image': unsupported mimetype ('application/octet-stream'). Supported file formats are 'image/png'.` The same API call worked for the last year without this error.

To Reproduce
Provide two images, and set the filenames to patchFn and maskFn; both files are PNGs. Call the openai.images.edit API with the following code.

Code snippets
[code block]

OS macOS 15.4
Node version Node v22.14.0
Library version openai 4.95.0
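One defensive check (a sketch, not the reporter's fix): verify the buffer really carries the PNG magic bytes before uploading, since an ambiguous blob or missing `.png` filename can make the upload default to application/octet-stream:

```javascript
// First 8 bytes of every valid PNG file (the PNG signature).
const PNG_SIGNATURE = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]);

// Returns true if the buffer starts with the PNG signature.
function isPng(buf) {
  return buf.length >= 8 && buf.subarray(0, 8).equals(PNG_SIGNATURE);
}
```

When the bytes check out, passing an explicit `.png` filename (or using the SDK's `toFile` helper with an explicit `type: 'image/png'` option) reportedly lets the library send the right mimetype; treat the exact helper options as an assumption against your installed version.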

Confidence 70%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh 5 months ago

pgvector cosine similarity returns irrelevant results for short search queries

Semantic search using pgvector returns irrelevant results when the query is 1–3 words. Short queries produce low-quality embeddings because the model has insufficient context to encode a meaningful semantic direction. A query like 'login error' returns documents about unrelated errors. Hybrid search combining vector similarity with keyword matching (pg_trgm or full-text search), fused with Reciprocal Rank Fusion, significantly improves short-query results.
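The fusion step described above can be sketched as plain Reciprocal Rank Fusion over the two ranked ID lists; k = 60 is the conventional constant from the original RRF paper, and is a tunable assumption here:

```javascript
// Merge ranked result lists (e.g. one from vector search, one from keyword
// search) by summing 1 / (k + rank) per document, then re-sorting by score.
function rrfMerge(rankedLists, k = 60) {
  const scores = new Map();
  for (const list of rankedLists) {
    list.forEach((docId, i) => {
      // rank is 1-based: the top hit in each list contributes 1 / (k + 1).
      scores.set(docId, (scores.get(docId) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([docId]) => docId);
}
```

Documents appearing in both lists accumulate score from each, which is why hybrid hits float to the top.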

Confidence 63%
✓ Verified Fix Available
1 fix · 3 verified
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

Latest release has typescript issues

Describe the bug
After the new GPT-3.5 update, building my TypeScript app now fails. I get errors: [code block]

To Reproduce
npm i openai@latest
npm run build

Code snippets
_No response_

OS windows/linux
Node version 18
Library version 3.2.0

Confidence 62%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

chat.completions.create return null on browser

Confirm this is a Node library issue and not an underlying OpenAI API issue
- [X] This is an issue with the Node library

Describe the bug
While developing a web app on localhost: works like a charm with 3.6, but after upgrading to v4 and adapting the code, chat.completions.create always returns null with dangerouslyAllowBrowser: true. Works well on the backend with the same code.

To Reproduce
1. Use the demo code in the browser
2. Add dangerouslyAllowBrowser to the initiator options
3. Run and get the error

Code snippets
_No response_

OS macOS
Node version Node v18.16
Library version openai v4

Confidence 58%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

Refused to set unsafe header "User-Agent"

Getting this error when trying to run the following code: `Refused to set unsafe header "User-Agent"` [code block]

Confidence 57%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

ReferenceError: fetch is not defined in import OpenAI from 'openai'

Hello there! There seem to have been a few issues around this that were resolved recently, but I'm still getting it, so thought I would share just in case it's something different.

Confirm this is a Node library issue and not an underlying OpenAI API issue
- [X] This is an issue with the Node library

Describe the bug
When building and running the OpenAI library in our NestJS API, we can do so locally with no issue; we can communicate with it and get responses etc. However, we have a Jest testing suite, and when trying to test, it fails with:

ReferenceError: fetch is not defined
  1 | import {Injectable} from "@nestjs/common";
  2 | import {ChatCompletionMessageParam} from "openai/resources/chat";
> 3 | import OpenAI from 'openai';
  at Object.<anonymous> (../../../../node_modules/openai/_shims/fetch.js:8:17)
  at Object.<anonymous> (../../../../node_modules/openai/core.js:64:17)
  at Object.<anonymous> (../../../../node_modules/openai/src/index.ts:116:7)

To Reproduce
1. Create a NestJS API
2. Install the openai library via npm
3. Create a controller that returns a response when a message is passed through via POST
4. Create a test suite in Jest to test this endpoint
5. Error occurs

Code snippets
_No response_

OS macOS
Node version Node v18.17.1
Library version openai 4.6.0

Confidence 56%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

Is 'functions' parameter not supported?

Describe the bug
I'm trying to use the functions parameter to help force a JSON response. Is this not supported?

To Reproduce
Call createChatCompletion with a 'functions' array.

Code snippets
[code block]

Results in the error: [code block]

OS macOS
Node version 18.7.0
Library version 3.3.0

Confidence 56%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

createChatCompletion() takes a long time to process.

Describe the bug
As described in the title, the method takes a while to load. This is a big problem because in Vercel the timeout limit for any call is 5 seconds, and in Netlify the limit is 10 seconds, but most often the call takes more than 10 seconds to respond. As a result, my website stopped working after refactoring it to use the new gpt-3.5-turbo model (it works fine with davinci). Basically, my website works on localhost but not when I deploy it to any service. Am I missing something? Is there a way to reduce the time?

To Reproduce
[code block]

This takes more than 10 seconds to complete.

Code snippets
_No response_

OS Windows 11
Node version v18.12.1
Library version 3.2.1
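One workaround sketch (this addresses the platform timeout, not the model latency itself): race the completion call against your platform's budget so you can fail fast and return a retry hint instead of hitting the hard cutoff. Streaming the response is the usual deeper fix, since most serverless platforms start their clock on time-to-first-byte.

```javascript
// Race a promise against a deadline. Rejects with a descriptive error so the
// handler can return a retry hint before the platform kills the invocation.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  // Clear the timer either way so the process can exit cleanly.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```

Usage would look like `await withTimeout(openaiCall(), 8_000)` inside the serverless handler, with a caught timeout mapped to a 503-plus-retry response.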

Confidence 56%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

Occasionally getting 429 Too many requests error

Describe the bug
I have an app that makes 8 text completion requests to OpenAI using GPT-3.5 turbo in the back-end. The requests are chained together, and all together I'm using around 5000 tokens. I have a pay-as-you-go account (older than 48 hours), which according to the documentation about rate limits means I can make 3500 requests per minute or use 90000 tokens per minute. Considering those limits, I don't believe I'm hitting any rate limit here, but I still get a 429 Too many requests error back on some occasions. Currently the app doesn't have any active users other than me.

To Reproduce
It happens randomly, so it's hard to say how to reproduce this issue other than chaining multiple createChatCompletion calls back to back.

Code snippets
[code block]

And it's used like so in the backend: [code block]

OS macOS
Node version Node v16.14.2
Library version "openai": "^3.2.1"
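The standard mitigation for intermittent 429s is retry with exponential backoff plus jitter. A minimal sketch; the `err.status` check is an assumption about how your client surfaces HTTP errors (the v3 axios-based client exposes it as `err.response.status`), so adapt it to your version:

```javascript
// Retry `fn` on 429 responses with exponential backoff and random jitter.
// Any other error, or running out of retries, rethrows to the caller.
async function withBackoff(fn, retries = 5, baseMs = 500) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (err.status !== 429 || attempt >= retries) throw err;
      // 500ms, 1s, 2s, ... plus up to 100ms of jitter to de-synchronize callers.
      const delay = baseMs * 2 ** attempt + Math.random() * 100;
      await new Promise((res) => setTimeout(res, delay));
    }
  }
}
```

For chained requests like the reporter's, spacing the calls out (or wrapping each in this helper) usually absorbs the occasional burst-window 429.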

Confidence 56%
Candidate Fix
1 fix
🤖 AI & LLMs · OpenAI
Fresh about 19 hours ago

The `punycode` module is deprecated in Node.js 21 (type = module)

Confirm this is a Node library issue and not an underlying OpenAI API issue
- [X] This is an issue with the Node library

Describe the bug
When importing the `openai` package in Node.js 21 it throws an error: [code block]

To Reproduce
`package.json`: [code block]
`index.js`: [code block]
`.env.dev`: [code block]

Code snippets
_No response_

OS macOS
Node version v21.2.0
Library version openai 4.19.1

Confidence 56%
Candidate Fix
1 fix