@alexismp
Last active November 8, 2022 21:58
Node.js 8 Cloud Function to write to a Google Sheets document
// Copyright 2018 Google LLC.
// SPDX-License-Identifier: Apache-2.0
const { google } = require("googleapis");
const { Storage } = require("@google-cloud/storage");
exports.csv2sheet = async (data, context) => {
  const fileName = data.name;
  // basic check that this is a *.csv file, etc...
  if (!fileName.endsWith(".csv")) {
    console.log("Not a .csv file, ignoring.");
    return;
  }
  // define name of new sheet (the filename without the .csv extension)
  const sheetName = fileName.slice(0, -4);
  // block on auth + getting the sheets API object
  const auth = await google.auth.getClient({
    scopes: [
      "https://www.googleapis.com/auth/spreadsheets",
      "https://www.googleapis.com/auth/devstorage.read_only"
    ]
  });
  const sheetsAPI = google.sheets({ version: "v4", auth });
  // create a new sheet and remember its ID
  const sheetId = await addEmptySheet(sheetsAPI, sheetName);
  const theData = await readCSVContent(sheetsAPI, data, sheetName);
  await populateAndStyle(sheetsAPI, theData, sheetId);
};
// Reads data from the CSV file uploaded to the storage bucket
// and returns a string of CSV values with line breaks
function readCSVContent(sheetsAPI, file, sheetName) {
  return new Promise((resolve, reject) => {
    const storage = new Storage();
    // Buffer.alloc(0) replaces the deprecated new Buffer('') constructor
    let fileContents = Buffer.alloc(0);
    storage.bucket(file.bucket).file(file.name).createReadStream()
      .on('error', function(err) {
        reject('The Storage API returned an error: ' + err);
      })
      .on('data', function(chunk) {
        fileContents = Buffer.concat([fileContents, chunk]);
      })
      .on('end', function() {
        let content = fileContents.toString('utf8');
        console.log("CSV content read as string: " + content);
        resolve(content);
      });
  });
}
// Creates a new sheet in the spreadsheet with the given name at position 2,
// with 26 columns and 2000 rows, and with the first row frozen.
// Returns its sheetId.
function addEmptySheet(sheetsAPI, sheetName) {
  return new Promise((resolve, reject) => {
    const addEmptySheetParams = {
      // reading SPREADSHEET_ID from a function environment variable
      spreadsheetId: process.env.SPREADSHEET_ID,
      resource: {
        requests: [
          {
            addSheet: {
              properties: {
                title: sheetName,
                index: 1,
                gridProperties: {
                  rowCount: 2000,
                  columnCount: 26,
                  frozenRowCount: 1
                }
              }
            }
          }
        ]
      }
    };
    sheetsAPI.spreadsheets.batchUpdate(addEmptySheetParams, function(err, response) {
      if (err) {
        reject("The Sheets API returned an error: " + err);
      } else {
        const sheetId = response.data.replies[0].addSheet.properties.sheetId;
        console.log("Created empty sheet: " + sheetId);
        resolve(sheetId);
      }
    });
  });
}
function populateAndStyle(sheetsAPI, theData, sheetId) {
  return new Promise((resolve, reject) => {
    // Using 'batchUpdate' allows multiple 'requests' to be sent in a single batch:
    // 1. populate the sheet referenced by its ID with the data received (a CSV string)
    // 2. style the first row: font size 11 and bold. Exercise left for the reader: resize columns
    const dataAndStyle = {
      spreadsheetId: process.env.SPREADSHEET_ID,
      resource: {
        requests: [
          {
            pasteData: {
              coordinate: {
                sheetId: sheetId,
                rowIndex: 0,
                columnIndex: 0
              },
              data: theData,
              delimiter: ","
            }
          },
          {
            repeatCell: {
              range: {
                sheetId: sheetId,
                startRowIndex: 0,
                endRowIndex: 1
              },
              cell: {
                userEnteredFormat: {
                  textFormat: {
                    fontSize: 11,
                    bold: true
                  }
                }
              },
              fields: "userEnteredFormat(textFormat)"
            }
          }
        ]
      }
    };
    sheetsAPI.spreadsheets.batchUpdate(dataAndStyle, function(err, response) {
      if (err) {
        reject("The Sheets API returned an error: " + err);
      } else {
        // theData is a CSV string, so report its length in characters
        console.log("Sheet " + sheetId + " populated with " + theData.length + " characters and header row styled.");
        resolve();
      }
    });
  });
}
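
For a quick local smoke test, the exported function can be invoked directly with a fake Cloud Storage event. The snippet below is only a sketch, not part of the original gist: it assumes the code above lives in index.js, that SPREADSHEET_ID is set in the environment, that Application Default Credentials can reach both the spreadsheet and the bucket, and the bucket and file names are placeholders.

// Sketch only: invoke csv2sheet locally with a fake Cloud Storage event.
// Assumes ./index.js exports csv2sheet, SPREADSHEET_ID is set, and ADC has
// Sheets + Storage access; the names below are placeholders.
const { csv2sheet } = require("./index");

csv2sheet({ bucket: "my-csv-bucket", name: "random-sales.csv" }, {})
  .then(() => console.log("Done."))
  .catch(err => console.error("Test run failed: " + err));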
@QPeiran

QPeiran commented Jan 7, 2020

It works fine for me; no need to change anything in the snippet, @cbuffone.

@MGCataldo

MGCataldo commented Apr 27, 2020

Thank you, alexismp, for the great job!

Everything works like a charm; I'd suggest just a few (honestly minor) corrections to the code.
In detail:

  • The sheetsAPI and sheetName variables are already in scope, so there is no need to pass them into the helper functions.
  • Function addEmptySheet does not require the sheetsAPI parameter; avoid passing it in.
  • Function readCSVContent does not require the sheetsAPI and sheetName parameters; avoid passing them in.
  • Function populateAndStyle has no need for the sheetsAPI parameter either, since it is already in scope.

Function declarations become

  • function addEmptySheet(sheetName)
  • function readCSVContent(file)
  • function populateAndStyle(theData, sheetId)

Final blocking calls become

  const sheetId = await addEmptySheet(sheetName);
  const theData = await readCSVContent(data);
  await populateAndStyle(theData, sheetId);

The whole code becomes:

const {google} = require("googleapis");
const {Storage} = require("@google-cloud/storage");

exports.csv2sheet = async (data,context) => {
  var filename = data.name;
  if (!filename.endsWith(".csv")){
    console.log("Not a .csv file, ignoring.");
    return;
  }

  // define name of new sheet
  const sheetName = filename.slice(0,-4);

  // block on auth + getting the sheets API object
  const auth = await google.auth.getClient({
    scopes: [
      "https://www.googleapis.com/auth/spreadsheets",
      "https://www.googleapis.com/auth/devstorage.read_only"
    ]
  });
  
  const sheetsAPI = google.sheets({version: 'v4',auth});

  function addEmptySheet(sheetName) {
    return new Promise((resolve, reject) => {
      const emptySheetParams = {
        spreadsheetId: process.env.SPREADSHEET_ID,
        resource: {
          requests: [
            {
              addSheet: {
                properties: {
                  title: sheetName,
                  index: 1,
                  gridProperties: {
                    rowCount: 2000,
                    columnCount: 26,
                    frozenRowCount: 1
                  }
                }
              }
            }
          ]
        }
      };
      sheetsAPI.spreadsheets.batchUpdate( emptySheetParams, function(err, response) {
          if (err) {
            reject("The Sheets API returned an error: " + err);
          } else {
            const sheetId = response.data.replies[0].addSheet.properties.sheetId;
            console.log("Created empty sheet: " + sheetId);
            resolve(sheetId);
          }
        });
    });
  }

  function readCSVContent(file) {
    return new Promise((resolve, reject) => {
      const storage = new Storage();
      let fileContents = Buffer.alloc(0);
      storage.bucket(file.bucket).file(file.name).createReadStream()
      .on('error', function(err) {
        reject('The Storage API returned an error: ' + err);
      })
      .on('data', function(chunk) {
        fileContents = Buffer.concat([fileContents, chunk]);
      })  
      .on('end', function() {
        let content = fileContents.toString('utf8');
        console.log("CSV content read as string : " + content );
        resolve(content);
      });
    });
  }

  function populateAndStyle(theData, sheetId) {
    return new Promise((resolve, reject) => {
      // Using 'batchUpdate' allows for multiple 'requests' to be sent in a single batch.
      // Populate the sheet referenced by its ID with the data received (a CSV string)
      // Style: set first row font size to 11 and to Bold. Exercise left for the reader: resize columns
      const dataAndStyle = {
        spreadsheetId: process.env.SPREADSHEET_ID,
        resource: {
          requests: [
            {
              pasteData: {
                coordinate: {
                  sheetId: sheetId,
                  rowIndex: 0,
                  columnIndex: 0
                },
                data: theData,
                delimiter: ","
              }
            },
            {
              repeatCell: {
                range: {
                  sheetId: sheetId,
                  startRowIndex: 0,
                  endRowIndex: 1
                },
                cell: {
                  userEnteredFormat: {
                    textFormat: {
                      fontSize: 11,
                      bold: true
                    }
                  }
                },
                fields: "userEnteredFormat(textFormat)"
              }
            }       
          ]
        }
      };
    
      sheetsAPI.spreadsheets.batchUpdate(dataAndStyle, function(err, response) {
        if (err) {
          reject("The Sheets API returned an error: " + err);
        } else {
          console.log("Sheet " + sheetId + " populated with " + theData.length + " characters and header row styled.");
          resolve();
        }
      });    
    });
  }
  const sheetId = await addEmptySheet(sheetName);
  const theData = await readCSVContent(data);
  await populateAndStyle(theData, sheetId);
}

Best regards, and thank you very much once more for the really helpful example; you explain the point very well.
Maurizio Cataldo

@beevo

beevo commented May 8, 2020

To everyone having an issue when uploading a csv file into the created bucket:
Make sure you add the service account email to the Permissions of the bucket. I set the Role to "Storage Admin."
I had the same issue until I added the service account email as a member to the bucket's permissions.
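
For reference, the same grant can also be scripted with the Storage client's IAM helpers. The snippet below is only a sketch, not from the gist; the bucket name and service account email are placeholders, and most people will simply do this once in the Cloud Console as described above.

// Sketch: grant the function's service account access to the bucket
// programmatically. Bucket name and service account email are placeholders.
const { Storage } = require("@google-cloud/storage");

async function grantBucketAccess() {
  const bucket = new Storage().bucket("my-csv-bucket");
  const [policy] = await bucket.iam.getPolicy();
  policy.bindings.push({
    role: "roles/storage.admin", // the role beevo used; a read-only role also works for this function
    members: ["serviceAccount:my-project@appspot.gserviceaccount.com"]
  });
  await bucket.iam.setPolicy(policy);
  console.log("Granted bucket access to the service account.");
}

grantBucketAccess().catch(console.error);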

@gruffie

gruffie commented Aug 12, 2020

Thanks for the code. This is the error I am getting:

TypeError: Cannot read property 'endsWith' of undefined
exports.csv2sheet (/workspace/index.js:6)

Any suggestions on what could be wrong?

@MGCataldo

Thanks for the code. This is the error I am getting:

TypeError: Cannot read property 'endsWith' of undefined
exports.csv2sheet (/workspace/index.js:6)

Any suggestions on what could be wrong?

endsWith is a JavaScript method, available since ECMAScript 6; check it out.
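
For what it's worth, that error means data.name was undefined, which typically happens when the function is invoked without a Cloud Storage event payload. A defensive check at the top of csv2sheet avoids the crash; the snippet below is only a sketch, not part of the original gist.

// Sketch: guard against a missing event payload, the usual cause of
// "Cannot read property 'endsWith' of undefined".
exports.csv2sheet = async (data, context) => {
  if (!data || typeof data.name !== "string") {
    console.log("No file name in the event payload, ignoring.");
    return;
  }
  const fileName = data.name;
  if (!fileName.endsWith(".csv")) {
    console.log("Not a .csv file, ignoring.");
    return;
  }
  // ... rest of the function unchanged
};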

@datagrime

Hello,

One issue I have with this script is that it does not update the data in the sheet when a .csv file is uploaded again. For example, if random-sales.csv is updated every day, how can we reflect that in the spreadsheet? Thanks!

@fernandosimao

Thank you! All works great!

@meljason

meljason commented May 3, 2022

Hi, I managed to upload the csv successfully to Cloud Storage. However, I am getting a permission issue with the Google Sheets API.

I am getting this error:

 {"severity":"ERROR","message":"Unhandled error The Sheets API returned an error: Error: Insufficient Permission"}

Does anybody know how to solve this?

@Moscovi

Moscovi commented Jul 17, 2022

Hello,

One issue I have with this script is that it does not update the data in the sheet when a .csv file is uploaded again. For example, if random-sales.csv is updated every day, how can we reflect that in the spreadsheet? Thanks!

Hi, I have the same problem, can anyone help with that?
