
@tanaikech
Last active September 2, 2022 18:28
Simple Script of Resumable Upload with Google Drive API for Node.js

This is a simple sample script for performing a resumable upload to Google Drive using Node.js. A resumable upload has two steps: first, request an upload session and retrieve its location (the endpoint for the upload) from the `Location` response header; then upload the file data to that location URL.

In this sample, a PNG file is uploaded with the resumable upload using a single chunk.

Sample script

Before you use this, please set the variables.

const fs = require("fs");
const request = require("request"); // Note: the "request" package was deprecated in February 2020.

const accessToken = "###"; // Please set the access token.
const filename = "./sample.png"; // Please set the filename with the path.

const fileSize = fs.statSync(filename).size;

// 1. Retrieve session for resumable upload.
request(
  {
    method: "POST",
    url:
      "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({ name: "sample.png", mimeType: "image/png" })
  },
  (err, res) => {
    if (err) {
      console.log(err);
      return;
    }

    // 2. Upload the file.
    request(
      {
        method: "PUT",
        url: res.headers.location,
        headers: { "Content-Range": `bytes 0-${fileSize - 1}/${fileSize}` },
        body: fs.readFileSync(filename)
      },
      (err, res, body) => {
        if (err) {
          console.log(err);
          return;
        }
        console.log(body);
      }
    );
  }
);


@garvitgoel

Hi,
the "fs" library is limited to 2 GB. Do you know of anything that would work for large files (~15 GB)?

@jhu7235

jhu7235 commented Dec 6, 2021

@garvitgoel I don't think you want to upload 2 GB chunks. The goal is to read just a portion of the file at a time and upload that portion. You can also try streaming uploads.
