Simple Script of Resumable Upload with Google Drive API for Axios

This is a simple sample script for achieving a resumable upload to Google Drive using Axios. To achieve the resumable upload, it is first required to retrieve the location, which is the endpoint for the upload; the location is included in the response headers of the session-initiation request. After the location has been retrieved, the file can be uploaded to that location URL.

In this sample, text data is uploaded with the resumable upload using a single chunk.

Upload Data using Single Chunk

This sample script achieves the resumable upload using a single chunk.

Sample script

Before you use this, please set the variables.

const axios = require("axios");

const accessToken = "###"; // Please set the access token.
const sampleText = "Hello World"; // Sample text data.
const filename = "sample.txt"; // Please set the filename on Google Drive.

// 1. Initiate the resumable upload session. The upload endpoint ("location") is returned in the response headers.
axios({
  method: "POST",
  url: "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
  headers: {
    Authorization: `Bearer ${accessToken}`,
    "Content-Type": "application/json",
  },
  data: JSON.stringify({
    name: filename,
    mimeType: "text/plain",
  }),
}).then(({ headers: { location } }) => {
  // 2. Upload the data to the retrieved location in a single request.
  const data = Buffer.from(sampleText);
  const fileSize = data.length;
  axios({
    method: "PUT",
    url: location,
    headers: { "Content-Range": `bytes 0-${fileSize - 1}/${fileSize}` },
    data: data,
  }).then(({ data }) => {
    console.log(data);
  });
});
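
As a small variation on the script above, the in-memory text can be replaced with the contents of a local file while the rest of the flow stays the same. This is a hedged sketch and not part of the original gist; the path ./sample.txt is a hypothetical example.

const fs = require("fs");

// Read a local file into a Buffer instead of using Buffer.from(sampleText).
const data = fs.readFileSync("./sample.txt"); // Hypothetical local path.
const fileSize = data.length; // Used for the Content-Range header as above.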

Upload Data using Multiple Chunks

This sample script achieves the resumable upload using multiple chunks. I posted this sample script as an answer to this thread on Stack Overflow.

Flow

  1. Download the data from the URL.
  2. Create the session for the resumable upload.
  3. Retrieve the downloaded data from the stream and convert it to a buffer.
    • For this, I used stream.Transform.
    • In this case, I pause the stream and upload the buffered data to Google Drive. I couldn't think of a way to achieve this without pausing the stream.
  4. When the buffer size reaches the declared chunk size, upload the buffer to Google Drive.
  5. When an upload results in an error, the same buffer is uploaded again. In this sample script, up to 3 retries are run; when all 3 retries fail, an error is raised. (A sketch of how the current upload status can be checked before such a retry is shown after this list.)
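
The following is a minimal sketch, not part of the original script, of how the resumable upload protocol lets you ask the session URL which bytes it has already received, so a retry can resume from the correct offset. Here, location and fileSize are assumed to come from the session-initiation step, checkUploadStatus is a hypothetical helper name, and axios is assumed to be available as in the scripts above.

const axios = require("axios");

// location: the session URL from the session-initiation response headers.
// fileSize: the total size of the data being uploaded, in bytes.
async function checkUploadStatus(location, fileSize) {
  // An empty-body PUT with "Content-Range: bytes */<total>" asks the session
  // how many bytes it has already received. 308 means the upload is incomplete.
  const res = await axios({
    method: "PUT",
    url: location,
    headers: { "Content-Range": `bytes */${fileSize}` },
    validateStatus: (status) => status == 308 || (status >= 200 && status < 300),
  });
  if (res.status == 308) {
    // When some bytes were received, the "Range" header looks like "bytes=0-12345",
    // so the next chunk should start at byte 12346.
    const range = res.headers["range"];
    const nextByte = range ? Number(range.split("-")[1]) + 1 : 0;
    return { completed: false, nextByte };
  }
  // A 2xx response means the upload has already been completed.
  return { completed: true, nextByte: fileSize };
}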

Sample script

Please set the variables in the function main().

const axios = require("axios");
const stream = require("stream");

function transfer(
  download_url,
  resumable_drive_url,
  file_type,
  file_length,
  accessToken,
  filename,
  chunkSize
) {
  return new Promise((resolve, reject) => {
    axios({
      method: "get",
      url: download_url,
      responseType: "stream",
      maxRedirects: 1,
    })
      .then((result) => {
        // Pass-through transform stream used to buffer the downloaded data.
        const streamTrans = new stream.Transform({
          transform: function (chunk, _, callback) {
            callback(null, chunk);
          },
        });

        // 1. Retrieve session for resumable upload.
        axios({
          method: "POST",
          url: resumable_drive_url,
          headers: {
            Authorization: `Bearer ${accessToken}`,
            "Content-Type": "application/json",
          },
          data: JSON.stringify({
            name: filename,
            mimeType: file_type,
          }),
        })
          .then(({ headers: { location } }) => {
            // 2. Upload the file.
            let startByte = 0;
            result.data.pipe(streamTrans);
            let bufs = [];
            streamTrans.on("data", async (chunk) => {
              bufs.push(chunk);
              const temp = Buffer.concat(bufs);
              if (temp.length >= chunkSize) {
                const dataChunk = temp.slice(0, chunkSize);
                const left = temp.slice(chunkSize);
                streamTrans.pause();
                let upcount = 0;
                const upload = function () {
                  console.log(
                    `Progress: from ${startByte} to ${
                      startByte + dataChunk.length - 1
                    } for ${file_length}`
                  );
                  axios({
                    method: "PUT",
                    url: location,
                    headers: {
                      "Content-Range": `bytes ${startByte}-${
                        startByte + dataChunk.length - 1
                      }/${file_length}`,
                    },
                    data: dataChunk,
                  })
                    .then(({ data }) => resolve(data))
                    .catch((err) => {
                      // The Drive API returns 308 when the chunk was accepted
                      // and more data is expected; continue with the next chunk.
                      if (err.response && err.response.status == 308) {
                        startByte += dataChunk.length;
                        streamTrans.resume();
                        return;
                      }
                      // Otherwise, retry the same chunk up to 3 times.
                      if (upcount == 3) {
                        reject(err);
                        return;
                      }
                      upcount++;
                      console.log("Retry");
                      upload();
                    });
                };
                upload();
                bufs = [left];
              }
            });
            streamTrans.on("end", () => {
              const dataChunk = Buffer.concat(bufs);
              if (dataChunk.length > 0) {
                // 3. Upload last chunk.
                let upcount = 0;
                const upload = function () {
                  console.log(
                    `Progress(last): from ${startByte} to ${
                      startByte + dataChunk.length - 1
                    } for ${file_length}`
                  );
                  axios({
                    method: "PUT",
                    url: location,
                    headers: {
                      "Content-Range": `bytes ${startByte}-${
                        startByte + dataChunk.length - 1
                      }/${file_length}`,
                    },
                    data: dataChunk,
                  })
                    .then(({ data }) => resolve(data))
                    .catch((err) => {
                      // Retry the last chunk up to 3 times before giving up.
                      if (upcount == 3) {
                        reject(err);
                        return;
                      }
                      upcount++;
                      upload();
                    });
                };
                upload();
              }
            });
            streamTrans.on("error", (err) => reject(err));
          })
          .catch((err) => reject(err));
      })
      .catch((error) => {
        reject(error);
      });
  });
}

function main() {
  const download_url = "###";
  const resumable_drive_url = "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable";
  const file_type = "###"; // Please set the mimeType of the downloaded data.
  const file_length = 12345; // Please set the data size of the downloaded data.
  const accessToken = "###"; // Please set the access token.
  const filename = "sample filename"; // Please set the filename on Google Drive.
  const chunkSize = 10485760; // Chunk size for the resumable upload. 10 MB is used as a sample. Please set a multiple of 256 KB (256 x 1024 bytes).

  transfer(
    download_url,
    resumable_drive_url,
    file_type,
    file_length,
    accessToken,
    filename,
    chunkSize
  )
    .then((res) => console.log(res))
    .catch((err) => console.log(err));
}

main();

Result:

When the above script is run for a file of 23,558,108 bytes (sample data), the following result is shown in the console.

Progress: from 0 to 10485759 for 23558108
Progress: from 10485760 to 20971519 for 23558108
Progress(last): from 20971520 to 23558107 for 23558108
{
  kind: 'drive#file',
  id: '###',
  name: 'sample filename',
  mimeType: '###'
}
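
As noted in the chunkSize comment in main(), every chunk except the last one has to be a multiple of 256 KB (256 x 1024 bytes). The following is a small hypothetical helper, not part of the original script, for rounding a desired chunk size down to a valid value.

const KB256 = 256 * 1024;

// Round a desired chunk size down to the nearest multiple of 256 KB,
// with 256 KB as the minimum.
function toValidChunkSize(desiredBytes) {
  return Math.max(KB256, Math.floor(desiredBytes / KB256) * KB256);
}

console.log(toValidChunkSize(10 * 1024 * 1024)); // 10485760 (10 MB, 40 x 256 KB)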


d1dee commented Oct 6, 2022

I'd suggest changing your initial POST request to:

axios({
  method: "POST",
  url: "https://www.googleapis.com/upload/drive/v3/files",
  params: { uploadType: "resumable" },
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${accessToken}`,
  },
  data: JSON.stringify({
    name: filename,
    mimeType: "text/plain",
  }),
});
