
Replace node-fetch with undici

Problem

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

- [x] This is a feature request for the Node library

Describe the feature or improvement you're requesting

I noticed this library is still using node-fetch because Node's native fetch is considered _experimental_. I think it'd be in the library's best interest to switch to undici instead. Undici is the fetch implementation in Node.js; for all intents and purposes it is _stable_ (https://github.com/nodejs/undici/issues/1737). We (the maintainers of undici) have some concerns about marking it as such in Node.js just yet because of the nature of the Fetch API spec, which itself occasionally adds breaking changes; that doesn't fit well with Node.js's versioning strategy (it's complicated; read the issue linked above for more details). Switching to undici for the shim will enable a significantly easier upgrade path in the future, whenever we figure out how to mark it as properly _stable_ in Node.js. Happy to help swap this out too if the maintainers approve 😄 🚀

Additional context

_No response_
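For context, the pattern under discussion is feature-detecting the runtime's native fetch (which on Node 18+ is provided by undici) and only falling back to a shim when it is absent. A minimal sketch of that idea; `resolveFetch` is an illustrative name, not a helper from the actual library:

```javascript
// Sketch: prefer the runtime's native fetch (on Node >= 18 this is
// undici under the hood), falling back to a shim on older runtimes.
// `resolveFetch` is a hypothetical helper name for illustration only.
function resolveFetch() {
  if (typeof globalThis.fetch === 'function') {
    // Node >= 18, browsers, and edge runtimes expose fetch natively.
    return globalThis.fetch.bind(globalThis);
  }
  // On older Node versions one would lazily load a shim here instead,
  // e.g. `(await import('undici')).fetch` (or node-fetch today).
  throw new Error('No fetch implementation available on this runtime');
}

console.log(typeof resolveFetch()); // "function" on Node >= 18
```

Swapping the shim from node-fetch to undici would leave this detection logic untouched, which is what makes the later upgrade path easier.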


Solution: Replace node-fetch with undici

Interesting, I didn't know that undici offered a different, faster request interface as well. There are a few considerations I'd have:

1. We currently use native fetch on platforms where it's available, partly because node-fetch wouldn't work in non-Node environments, but partly because some runtimes may desire/expect/require native fetch to be used (for example, I believe Vercel Edge Runtime manipulates …).
2. We accept a user-provided fetch function in client instantiation, and we'd want to continue to do so.

I'd be a little worried about bifurcating our request pathways between "env-or-user-provided fetch" or "built-in undici .request et al", but if it's what you'd recommend after considering the above and exploring the codebase, I'm certainly open to it!
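The second consideration, accepting a user-provided fetch at client instantiation, is commonly handled with an options parameter that defaults to the global implementation, so that all requests flow through a single pathway. A hedged sketch, assuming invented names (`ApiClient` and its options are illustrative, not openai-node's real API):

```javascript
// Hypothetical client showing how a user-supplied fetch can coexist
// with the runtime default. Names are illustrative only.
class ApiClient {
  constructor({ fetch: customFetch } = {}) {
    // Fall back to whatever the environment provides (undici on Node >= 18).
    this.fetchFn = customFetch ?? globalThis.fetch.bind(globalThis);
  }

  get(url) {
    // Every request funnels through one pathway, which avoids the
    // bifurcation between "env-or-user-provided fetch" and a separate
    // built-in undici .request path mentioned above.
    return this.fetchFn(url);
  }
}

// A caller can inject an instrumented fetch, e.g. for logging or testing:
const calls = [];
const client = new ApiClient({
  fetch: async (url) => { calls.push(url); return { ok: true }; },
});
client.get('https://example.test/v1/models');
console.log(calls.length); // 1
```

Defaulting to `globalThis.fetch` keeps edge runtimes that expect native fetch working, while the override preserves the existing user-provided-fetch contract.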

Validation

Resolved in openai/openai-node GitHub issue #392. Community reactions: 1 upvote.


Submitted by Alex Chen (2450 rep)

Tags: openai, gpt, llm, api