Google Drive API: file upload stream backpressure issue

Mar 14, 2026
Problem

Upload stream backpressure does not appear to be handled properly. In the following example, the stream is consumed very quickly (100% reached in 2 seconds) whereas the upload takes a while, which is expected with my ADSL connection. This means the whole file is loaded into memory and then sent gradually, which defeats the benefit of streaming. Using your example: [code block]

Configuration:

- Node.js v10.10.0
- googleapis 33.0.0
- OS: Windows 7


Fix

Implement Stream Backpressure Handling for File Uploads

Medium Risk

The issue arises because the upload stream does not properly manage backpressure, causing the entire file to be read into memory while the upload is still in progress. This happens when the writable side never signals the readable stream to pause while its buffer is full, leading to excessive memory usage that grows with file size.


  1. Modify Stream Handling

    Adjust the upload stream to handle backpressure: set the 'highWaterMark' option so the file is read in small chunks, and pause the readable stream whenever the writable side's buffer is full, resuming on 'drain'.

    javascript
    const { google } = require('googleapis');
    const fs = require('fs');
    
    // 'auth' is assumed to be an already-authorized client (e.g. OAuth2).
    const drive = google.drive({ version: 'v3', auth });
    const fileMetadata = { name: 'file.txt' };
    const media = {
      mimeType: 'text/plain',
      // A small highWaterMark caps how much of the file is buffered at once.
      body: fs.createReadStream('file.txt', { highWaterMark: 16 * 1024 }) // 16 KiB chunks
    };
    
    const uploadFile = async () => {
      const response = await drive.files.create({
        resource: fileMetadata,
        media: media,
        fields: 'id'
      });
      console.log('File Id:', response.data.id);
    };
    
    uploadFile().catch((err) => console.error('Upload failed:', err));
  2. Set High Water Mark

    Configure the 'highWaterMark' option on the read stream to cap how much data is buffered in memory at once. On the writable side, 'highWaterMark' sets the point at which write() returns false, signalling the producer to pause until the 'drain' event fires.

    javascript
    const stream = fs.createReadStream('file.txt', { highWaterMark: 16 * 1024 }); // Adjust buffer size as needed
  3. Implement Error Handling

    Add error handling to the stream to catch any issues during the upload process, ensuring that the application can gracefully handle failures without crashing.

    javascript
    media.body.on('error', (err) => {
      console.error('Upload error:', err);
    });
  4. Test Upload with Large Files

    Run tests with large files to ensure that the backpressure handling is functioning correctly and that memory usage remains within acceptable limits during uploads.

    javascript
    const largeFileStream = fs.createReadStream('largeFile.txt', { highWaterMark: 16 * 1024 });
    largeFileStream.on('data', (chunk) => {
      // Log chunk size alongside heap usage to confirm memory stays flat.
      console.log('Chunk size:', chunk.length, 'heapUsed:', process.memoryUsage().heapUsed);
    });

Validation

Confirm the fix by uploading a large file and monitoring memory usage. The upload should proceed without loading the entire file into memory at once, and the process should complete within a reasonable time frame based on your connection speed.
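One way to do that monitoring is to sample heap usage on an interval while the upload runs and report the peak afterwards. In this sketch, `simulatedUpload` is a placeholder; swap in the real `drive.files.create` call:

```javascript
// Sample heap usage every 50 ms while the upload is in flight.
const samples = [];
const timer = setInterval(() => samples.push(process.memoryUsage().heapUsed), 50);

// Placeholder for the actual upload; resolves after ~300 ms.
function simulatedUpload() {
  return new Promise((resolve) => setTimeout(resolve, 300));
}

simulatedUpload().then(() => {
  clearInterval(timer);
  const peakMiB = Math.max(...samples) / (1024 * 1024);
  console.log(`peak heap during upload: ${peakMiB.toFixed(1)} MiB`);
});
```

If the fix is working, the peak should sit near the highWaterMark plus client overhead rather than scaling with the size of the file being uploaded.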


Submitted by: Alex Chen

Tags: google-api, oauth, sdk, type: bug, priority: p2