Last active September 8, 2023 22:48
Upload folder to S3 (Node.JS)
const AWS = require("aws-sdk"); // from AWS SDK
const fs = require("fs"); // from node.js
const path = require("path"); // from node.js

// configuration
const config = {
  s3BucketName: '',
  folderPath: '../dist' // path relative to script's location
};

// initialize S3 client
const s3 = new AWS.S3({ signatureVersion: 'v4' });

// resolve full folder path
const distFolderPath = path.join(__dirname, config.folderPath);

// get list of files from 'dist' directory
fs.readdir(distFolderPath, (err, files) => {
  if (!files || files.length === 0) {
    console.log(`provided folder '${distFolderPath}' is empty or does not exist.`);
    console.log('Make sure your project was compiled!');
    return;
  }

  // for each file in the directory
  for (const fileName of files) {
    // get the full path of the file
    const filePath = path.join(distFolderPath, fileName);

    // skip directories
    if (fs.lstatSync(filePath).isDirectory()) {
      continue;
    }

    // read file contents
    fs.readFile(filePath, (error, fileContent) => {
      // if unable to read file contents, throw exception
      if (error) { throw error; }

      // upload file to S3
      s3.putObject({
        Bucket: config.s3BucketName,
        Key: fileName,
        Body: fileContent
      }, (res) => {
        console.log(`Successfully uploaded '${fileName}'!`);
      });
    });
  }
});

It could be replaced by this command:
aws s3 sync <folder-path> s3://<bucket-name>


@mikhail-angelov how can I use aws cli (aws s3 sync) from a Lambda function since it is not included by default?


@crsepulv, it's probably hard to use the AWS CLI from Lambda, and I'm not sure you need it there; the Lambda environment has some restrictions on reading/writing files.

But if you need it, there is a way to do the sync via aws-sdk:

const AWS = require('aws-sdk')
const co = require('co')
const fs = require('fs')
AWS.config.update({ region: 'us-east-1' })
const s3 = new AWS.S3({ apiVersion: '2006-03-01' })

// recursively collect file paths under dir into acc
function listFiles(dir, acc) {
  const files = fs.readdirSync(dir) || []
  files.forEach((value) => {
    const name = `${dir}/${value}`
    if (fs.statSync(name).isDirectory()) {
      listFiles(name, acc)
    } else {
      acc.push(name)
    }
  })
  return acc
}

co(function* () {
  const bucketId = 'your-bucket'
  const files = listFiles('local-folder', [])
  for (const file of files) {
    console.log('uploading file: ', file)
    yield s3.upload({
      Bucket: bucketId,
      Key: file,
      Body: fs.readFileSync(file),
    }).promise()
  }
})
  .catch((err) => {
    console.log('got error: ', err)
  })


Unfortunately the official JS AWS SDK doesn't have the aws s3 sync <folder-path> s3://<bucket-name> feature.
I'm looking for a node package that allows the use of the sync feature. I have found this:
but the project seems abandoned and has several unaddressed issues.
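In the absence of such a package, the diff step of a sync can at least be sketched as a pure function. This is only an illustration (syncPlan and its argument names are my own); a real sync would also detect changed files, e.g. by comparing size, mtime, or ETag:

```javascript
// Compute a minimal sync plan from flat lists of local and remote S3 keys.
// Illustrative only: does not detect files that changed in place.
function syncPlan(localKeys, remoteKeys) {
  const remote = new Set(remoteKeys);
  const local = new Set(localKeys);
  return {
    toUpload: localKeys.filter((k) => !remote.has(k)),  // present locally, missing remotely
    toDelete: remoteKeys.filter((k) => !local.has(k)),  // present remotely, gone locally
  };
}

const plan = syncPlan(
  ['index.html', 'app.js', 'style.css'],
  ['index.html', 'old.js']
);
console.log(plan.toUpload); // [ 'app.js', 'style.css' ]
console.log(plan.toDelete); // [ 'old.js' ]
```

The upload/delete steps would then call s3.upload and s3.deleteObject over the two lists.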


hackhat commented Sep 12, 2018

Anybody found a solution for this? Something that keeps the folder structure.


hackhat commented Sep 12, 2018

Added one that works on Windows and keeps the file structure intact;

the other examples don't work properly on Windows.


Hi hackhat,

I have used this one, which is more recent, and it has worked perfectly for me:

Cheerz ;)


sarkistlt commented Nov 6, 2018

const fs = require('fs');
const path = require('path');
const async = require('async');
const AWS = require('aws-sdk');
const readdir = require('recursive-readdir');

const { BUCKET, KEY, SECRET } = process.env;
const rootFolder = path.resolve(__dirname, './');
const uploadFolder = './upload-folder';
const s3 = new AWS.S3({
  signatureVersion: 'v4',
  accessKeyId: KEY,
  secretAccessKey: SECRET,
});

function getFiles(dirPath) {
  return fs.existsSync(dirPath) ? readdir(dirPath) : [];
}

async function deploy(upload) {
  if (!BUCKET || !KEY || !SECRET) {
    throw new Error('you must provide env. variables: [BUCKET, KEY, SECRET]');
  }

  const filesToUpload = await getFiles(path.resolve(__dirname, upload));

  return new Promise((resolve, reject) => {
    // upload at most 10 files concurrently
    async.eachOfLimit(filesToUpload, 10, async.asyncify(async (file) => {
      const Key = file.replace(`${rootFolder}/`, '');
      console.log(`uploading: [${Key}]`);
      return new Promise((res, rej) => {
        s3.upload({
          Key,
          Bucket: BUCKET,
          Body: fs.readFileSync(file),
        }, (err) => {
          if (err) {
            return rej(new Error(err));
          }
          res({ result: true });
        });
      });
    }), (err) => {
      if (err) {
        return reject(new Error(err));
      }
      resolve({ result: true });
    });
  });
}

deploy(uploadFolder)
  .then(() => {
    console.log('task complete');
  })
  .catch((err) => {
    console.error(err);
  });


larryg01 commented Jul 1, 2020

@sarkistlt Thanks for this it works great


This is what I use

const path = require('path')
const util = require('util')
const exec = util.promisify(require('child_process').exec)

const distDir = path.join('someDir', 'dist')
const command = `aws s3 sync ${distDir} s3://bucket-name`

exec(command)
  .then(() => console.log('Deploy complete'))
  .catch(err => {
    console.error(err)
  })


adrienv1520 commented Oct 17, 2020

Important notes:

  • uploadDirectory uploads a directory and its sub-directories recursively;
  • path can be an absolute or relative path to a directory;
  • params and options are the same as in the AWS documentation, so these functions are very flexible;
  • rootKey is the root AWS key to use; by default it is the S3 root, e.g. if rootKey is public/images and you upload /Users/you/my-project/images, files will be uploaded to s3://bucket/public/images;
  • aws-sdk will automatically check for the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables; it is the safest way to deal with credentials imo;
  • without clustering I found uploading a directory of 1254 files was nearly 2 times faster than the native AWS CLI sync method (it's Python underneath; Node.js should be faster);
  • don't forget to set each file's content type, mostly for static websites, or it would default to application/octet-stream and lead to unexpected behaviors;
  • use your favorite debugger/logger over console;
  • const x = { ...params }; is similar to Object.assign BUT will not deeply clone objects, which could lead to unexpected object mutations; prefer a safe clone function or similar;
  • tested with Node.js 12.15.0;
  • improve this by clustering the whole upload; some extra code/controls will be needed (based on files' length, number of files, available cores, etc.).
const { createReadStream, promises: { readdir, stat: getStats } } = require('fs');
const { resolve, join } = require('path');
const S3 = require('aws-sdk/clients/s3');
const { getMIMEType } = require('node-mime-types');

const s3 = new S3({
  signatureVersion: 'v4',
});

// upload file
const uploadFile = async function uploadFile({ path, params, options } = {}) {
  const parameters = { ...params };
  const opts = { ...options };

  try {
    const rstream = createReadStream(resolve(path));

    rstream.once('error', (err) => {
      console.error(`unable to upload file ${path}, ${err.message}`);
    });

    parameters.Body = rstream;
    parameters.ContentType = getMIMEType(path);
    await s3.upload(parameters, opts).promise();

    console.log(`${parameters.Key} (${parameters.ContentType}) uploaded in bucket ${parameters.Bucket}`);
  } catch (e) {
    throw new Error(`unable to upload file ${path} at ${parameters.Key}, ${e.message}`);
  }

  return true;
};

// upload directory and its sub-directories if any
const uploadDirectory = async function uploadDirectory({
  path,
  params,
  options,
  rootKey,
} = {}) {
  const parameters = { ...params };
  const opts = { ...options };
  const root = rootKey && rootKey.constructor === String ? rootKey : '';
  let dirPath;

  try {
    dirPath = resolve(path);
    const dirStats = await getStats(dirPath);

    if (!dirStats.isDirectory()) {
      throw new Error(`${dirPath} is not a directory`);
    }

    console.log(`uploading directory ${dirPath}...`);

    const filenames = await readdir(dirPath);

    if (Array.isArray(filenames)) {
      await Promise.all(filenames.map(async (filename) => {
        const filepath = `${dirPath}/${filename}`;
        const fileStats = await getStats(filepath);

        if (fileStats.isFile()) {
          parameters.Key = join(root, filename);
          await uploadFile({
            path: filepath,
            params: parameters,
            options: opts,
          });
        } else if (fileStats.isDirectory()) {
          await uploadDirectory({
            path: filepath,
            params,
            options: opts,
            rootKey: join(root, filename),
          });
        }
      }));
    }
  } catch (e) {
    throw new Error(`unable to upload directory ${path}, ${e.message}`);
  }

  console.log(`directory ${dirPath} successfully uploaded`);
  return true;
};

// example
(async () => {
  try {
    console.time('s3 upload');

    await uploadDirectory({
      path: '../front/dist',
      params: {
        Bucket: 'my-bucket',
      },
      options: {},
      rootKey: '',
    });

    console.timeEnd('s3 upload');
  } catch (e) {
    console.error(e);
  }
})();

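The earlier note that { ...params } does not deeply clone can be seen in isolation (a toy sketch, not part of the uploader itself):

```javascript
const params = { Bucket: 'my-bucket', Metadata: { owner: 'me' } };
const copy = { ...params }; // shallow copy: nested objects are shared

copy.Bucket = 'other-bucket';     // top-level write: original untouched
copy.Metadata.owner = 'someone';  // nested write: mutates the original too

console.log(params.Bucket);         // 'my-bucket'
console.log(params.Metadata.owner); // 'someone'
```

This is why reusing one parameters object across concurrent or recursive calls deserves care, as the notes above point out.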

We've just released a package for that at, fully tested. You can use the code or the package as you need.


Prozi commented Aug 30, 2021

I fixed a few minor errors with paths / other things and
published this as a package

the source (MIT)

uses standard S3 SDK authentication (you'll need to read about this if you don't know what I mean)

