@alexismp
Last active November 8, 2022 21:58
Node.js 8 Cloud Function to write to a Google Sheets document
// Copyright 2018 Google LLC.
// SPDX-License-Identifier: Apache-2.0
const { google } = require("googleapis");
const { Storage } = require("@google-cloud/storage");

exports.csv2sheet = async (data, context) => {
  const fileName = data.name;
  // basic check that this is a *.csv file, etc...
  if (!fileName.endsWith(".csv")) {
    console.log("Not a .csv file, ignoring.");
    return;
  }
  // define the name of the new sheet (the filename without the .csv extension)
  const sheetName = fileName.slice(0, -4);
  // block on auth + getting the sheets API object
  const auth = await google.auth.getClient({
    scopes: [
      "https://www.googleapis.com/auth/spreadsheets",
      "https://www.googleapis.com/auth/devstorage.read_only"
    ]
  });
  const sheetsAPI = google.sheets({ version: "v4", auth });
  // create a new sheet and remember its ID
  const sheetId = await addEmptySheet(sheetsAPI, sheetName);
  const theData = await readCSVContent(sheetsAPI, data, sheetName);
  await populateAndStyle(sheetsAPI, theData, sheetId);
};
// Reads data from the CSV file uploaded to the storage bucket
// and returns a string of CSV values separated by line breaks.
function readCSVContent(sheetsAPI, file, sheetName) {
  return new Promise((resolve, reject) => {
    const storage = new Storage();
    let fileContents = Buffer.alloc(0); // avoids the deprecated `new Buffer()` constructor
    storage.bucket(file.bucket).file(file.name).createReadStream()
      .on('error', function(err) {
        reject('The Storage API returned an error: ' + err);
      })
      .on('data', function(chunk) {
        fileContents = Buffer.concat([fileContents, chunk]);
      })
      .on('end', function() {
        const content = fileContents.toString('utf8');
        console.log("CSV content read as string: " + content);
        resolve(content);
      });
  });
}
// Creates a new sheet in the spreadsheet with the given name at position 2,
// with 26 columns and 2000 rows, and with the first row frozen.
// Returns its sheetId.
function addEmptySheet(sheetsAPI, sheetName) {
  return new Promise((resolve, reject) => {
    const addEmptySheetParams = {
      // the target spreadsheet is read from the SPREADSHEET_ID environment variable
      spreadsheetId: process.env.SPREADSHEET_ID,
      resource: {
        requests: [
          {
            addSheet: {
              properties: {
                title: sheetName,
                index: 1,
                gridProperties: {
                  rowCount: 2000,
                  columnCount: 26,
                  frozenRowCount: 1
                }
              }
            }
          }
        ]
      }
    };
    sheetsAPI.spreadsheets.batchUpdate(addEmptySheetParams, function(err, response) {
      if (err) {
        reject("The Sheets API returned an error: " + err);
      } else {
        const sheetId = response.data.replies[0].addSheet.properties.sheetId;
        console.log("Created empty sheet: " + sheetId);
        resolve(sheetId);
      }
    });
  });
}
// Populates the new sheet with the CSV data and styles its header row.
function populateAndStyle(sheetsAPI, theData, sheetId) {
  return new Promise((resolve, reject) => {
    // Using 'batchUpdate' allows multiple 'requests' to be sent in a single batch:
    // - populate the sheet referenced by its ID with the data received (a CSV string)
    // - style the first row: font size 11 and bold. Exercise left for the reader: resize columns.
    const dataAndStyle = {
      spreadsheetId: process.env.SPREADSHEET_ID,
      resource: {
        requests: [
          {
            pasteData: {
              coordinate: {
                sheetId: sheetId,
                rowIndex: 0,
                columnIndex: 0
              },
              data: theData,
              delimiter: ","
            }
          },
          {
            repeatCell: {
              range: {
                sheetId: sheetId,
                startRowIndex: 0,
                endRowIndex: 1
              },
              cell: {
                userEnteredFormat: {
                  textFormat: {
                    fontSize: 11,
                    bold: true
                  }
                }
              },
              fields: "userEnteredFormat(textFormat)"
            }
          }
        ]
      }
    };
    sheetsAPI.spreadsheets.batchUpdate(dataAndStyle, function(err, response) {
      if (err) {
        reject("The Sheets API returned an error: " + err);
      } else {
        console.log(sheetId + " sheet populated with " + theData.length + " characters of CSV data and header row styled.");
        resolve();
      }
    });
  });
}
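
For reference, a minimal deployment sketch with hypothetical placeholder names (YOUR_BUCKET is the bucket whose uploads trigger the function, YOUR_SPREADSHEET_ID is the ID of the target spreadsheet):

  gcloud functions deploy csv2sheet \
    --runtime nodejs8 \
    --trigger-resource YOUR_BUCKET \
    --trigger-event google.storage.object.finalize \
    --set-env-vars SPREADSHEET_ID=YOUR_SPREADSHEET_ID

The function reads the target spreadsheet from the SPREADSHEET_ID environment variable, and that spreadsheet must be shared with the service account the function runs as.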
@SieteCarly

Thanks for the tutorial. I tried following this, but I'm getting the following error:


Deployment failure:
Function failed on loading user code. Error message: Code in file index.js can't be loaded.
Is there a syntax error in your code?
Detailed stack trace: /user_code/index.js:6
exports.function2sheet = async (data, context) => {
                               ^

SyntaxError: Unexpected token (
    at createScript (vm.js:56:10)
    at Object.runInThisContext (vm.js:97:10)
    at Module._compile (module.js:549:28)
    at Object.Module._extensions..js (module.js:586:10)
    at Module.load (module.js:494:32)
    at tryModuleLoad (module.js:453:12)
    at Function.Module._load (module.js:445:3)
    at Module.require (module.js:504:17)
    at require (internal/module.js:20:19)
    at getUserFunction (/var/tmp/worker/worker.js:439:24)

@abarro

abarro commented Apr 10, 2019

@alexismp
Author

alexismp commented Jun 7, 2019

I've updated the snippet and just tested that it works now (I'm also currently updating the codelab).
Let me know if it still doesn't work.

@nanotower

Hi, Alex! I'm learning JS and the cloud, and I don't understand what I have to do with these snippets. Do I just use them in the Google Cloud Functions body?

@cbuffone

cbuffone commented Dec 4, 2019

Hi, I have some rather basic questions on this... I was able to deploy the function without errors, but when I load a CSV into my bucket... nothing happens.

Am I supposed to change anything in the above code? It's not entirely clear (to me) which portions are placeholders for MY INFORMATION vs. which portions need to remain as-is. Does the following need to be included somewhere:

  1. the actual CSV file name that I am using
  2. the actual bucket I am using
  3. the actual sheet ID I am using

Thanks in advance!

@nanotower

I'm there too. I've pasted the code and dependencies, but nothing happens when I upload more content into the proper bucket. Help, please!

@PolarBearGG

Same problem, please help.

@QPeiran

QPeiran commented Jan 7, 2020

It works fine for me; no need to change anything in the snippet, @cbuffone.

@MGCataldo

MGCataldo commented Apr 27, 2020

Thank you alexismp for the great job!

Everything works like a charm. I'd suggest just a few (honestly minor) corrections to the code.
In detail:

  • Variables sheetsAPI and sheetName are visible to the whole script, so there is no need to pass them into the functions.
  • Function addEmptySheet does not require the sheetsAPI variable; avoid passing it in.
  • Function readCSVContent does not require the sheetsAPI and sheetName variables for its execution; avoid passing them in.
  • Function populateAndStyle has no need for the sheetsAPI variable to be passed in; it's already in scope...

The function declarations become

  • function addEmptySheet(sheetName)
  • function readCSVContent(file)
  • function populateAndStyle(theData, sheetId)

The final blocking calls become

  const sheetId = await addEmptySheet(sheetName);
  const theData = await readCSVContent(data);
  await populateAndStyle(theData, sheetId);

The whole code becomes:

const {google} = require("googleapis");
const {Storage} = require("@google-cloud/storage");

exports.csv2sheet = async (data,context) => {
  var filename = data.name;
  if (!filename.endsWith(".csv")){
    console.log("Not a .csv file, ignoring.");
    return;
  }

  // define name of new sheet
  const sheetName = filename.slice(0,-4);

  // block on auth + getting the sheets API object
  const auth = await google.auth.getClient({
    scopes: [
      "https://www.googleapis.com/auth/spreadsheets",
      "https://www.googleapis.com/auth/devstorage.read_only"
    ]
  });
  
  const sheetsAPI = google.sheets({version: 'v4',auth});

  function addEmptySheet(sheetName) {
    return new Promise((resolve, reject) => {
      const emptySheetParams = {
        spreadsheetId: process.env.SPREADSHEET_ID,
        resource: {
          requests: [
            {
              addSheet: {
                properties: {
                  title: sheetName,
                  index: 1,
                  gridProperties: {
                    rowCount: 2000,
                    columnCount: 26,
                    frozenRowCount: 1
                  }
                }
              }
            }
          ]
        }
      };
      sheetsAPI.spreadsheets.batchUpdate( emptySheetParams, function(err, response) {
          if (err) {
            reject("The Sheets API returned an error: " + err);
          } else {
            const sheetId = response.data.replies[0].addSheet.properties.sheetId;
            console.log("Created empty sheet: " + sheetId);
            resolve(sheetId);
          }
        });
    });
  }

  function readCSVContent(file) {
    return new Promise((resolve, reject) => {
      const storage = new Storage();
      let fileContents = Buffer.alloc(0); // avoids the deprecated `new Buffer()` constructor
      storage.bucket(file.bucket).file(file.name).createReadStream()
      .on('error', function(err) {
        reject('The Storage API returned an error: ' + err);
      })
      .on('data', function(chunk) {
        fileContents = Buffer.concat([fileContents, chunk]);
      })  
      .on('end', function() {
        let content = fileContents.toString('utf8');
        console.log("CSV content read as string : " + content );
        resolve(content);
      });
    });
  }

  function populateAndStyle(theData, sheetId) {
    return new Promise((resolve, reject) => {
      // Using 'batchUpdate' allows for multiple 'requests' to be sent in a single batch.
      // Populate the sheet referenced by its ID with the data received (a CSV string)
      // Style: set first row font size to 11 and to Bold. Exercise left for the reader: resize columns
      const dataAndStyle = {
        spreadsheetId: process.env.SPREADSHEET_ID,
        resource: {
          requests: [
            {
              pasteData: {
                coordinate: {
                  sheetId: sheetId,
                  rowIndex: 0,
                  columnIndex: 0
                },
                data: theData,
                delimiter: ","
              }
            },
            {
              repeatCell: {
                range: {
                  sheetId: sheetId,
                  startRowIndex: 0,
                  endRowIndex: 1
                },
                cell: {
                  userEnteredFormat: {
                    textFormat: {
                      fontSize: 11,
                      bold: true
                    }
                  }
                },
                fields: "userEnteredFormat(textFormat)"
              }
            }       
          ]
        }
      };
    
      sheetsAPI.spreadsheets.batchUpdate(dataAndStyle, function(err, response) {
        if (err) {
          reject("The Sheets API returned an error: " + err);
        } else {
          console.log(sheetId + " sheet populated with " + theData.length + " rows and column style set.");
          resolve();
        }
      });    
    });
  }
  const sheetId = await addEmptySheet(sheetName);
  const theData = await readCSVContent(data);
  await populateAndStyle(theData, sheetId);
}

Best regards, and thank you very much once more for the really helpful example; you clarify the point very well.
Maurizio Cataldo

@beevo

beevo commented May 8, 2020

To everyone having an issue when uploading a csv file into the created bucket:
Make sure you add the service account email to the Permissions of the bucket. I set the Role to "Storage Admin."
I had the same issue until I added the service account email as a member to the bucket's permissions.
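
For reference, a command-line sketch of the same permission grant (SERVICE_ACCOUNT_EMAIL and YOUR_BUCKET are placeholders for your function's service account and your trigger bucket):

  gsutil iam ch serviceAccount:SERVICE_ACCOUNT_EMAIL:roles/storage.admin gs://YOUR_BUCKET

A narrower role such as roles/storage.objectViewer should also work, since the function only reads the uploaded CSV.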

@gruffie

gruffie commented Aug 12, 2020

Thanks for the code. This is the error I am getting:

TypeError: Cannot read property 'endsWith' of undefined
exports.csv2sheet ( /workspace/index.js:6 )

Any suggestions on what could be wrong?

@MGCataldo

Thanks for the code. This is the error I am getting:

TypeError: Cannot read property 'endsWith' of undefined
exports.csv2sheet ( /workspace/index.js:6 )

Any suggestions on what could be wrong?

endsWith is a JavaScript method, available since ECMAScript 6;
check it out.

@datagrime

Hello,

One issue I have with this script is that it does not update the sheet with new data when .csv files are uploaded again. For example, if random-sales.csv is updated every day, how can we reflect that in the spreadsheet? Thanks!

@fernandosimao

Thank you! All works great!

@meljason

meljason commented May 3, 2022

Hi, I managed to upload the CSV successfully to Cloud Storage. However, I am getting a permission issue with the Google Sheets API.

I am getting this error:

 {"severity":"ERROR","message":"Unhandled error The Sheets API returned an error: Error: Insufficient Permission"}

Does anybody know how to solve this?

@Moscovi

Moscovi commented Jul 17, 2022

Hello,

One issue I have with this script is that it does not update the sheet with new data when .csv files are uploaded again. For example, if random-sales.csv is updated every day, how can we reflect that in the spreadsheet? Thanks!

Hi, I have the same problem. Can anyone help with that?
