
@jlouros
Last active September 8, 2023 22:48
Upload folder to S3 (Node.JS)
const AWS = require("aws-sdk"); // from AWS SDK
const fs = require("fs"); // from node.js
const path = require("path"); // from node.js
// configuration
const config = {
s3BucketName: 'your.s3.bucket.name',
folderPath: '../dist' // path relative script's location
};
// initialize S3 client
const s3 = new AWS.S3({ signatureVersion: 'v4' });
// resolve full folder path
const distFolderPath = path.join(__dirname, config.folderPath);
// get of list of files from 'dist' directory
fs.readdir(distFolderPath, (err, files) => {
if(!files || files.length === 0) {
console.log(`provided folder '${distFolderPath}' is empty or does not exist.`);
console.log('Make sure your project was compiled!');
return;
}
// for each file in the directory
for (const fileName of files) {
// get the full path of the file
const filePath = path.join(distFolderPath, fileName);
// ignore if directory
if (fs.lstatSync(filePath).isDirectory()) {
continue;
}
// read file contents
fs.readFile(filePath, (error, fileContent) => {
// if unable to read file contents, throw exception
if (error) { throw error; }
// upload file to S3
s3.putObject({
Bucket: config.s3BucketName,
Key: fileName,
Body: fileContent
}, (res) => {
console.log(`Successfully uploaded '${fileName}'!`);
});
});
}
});
@sarfarazansari

There is something missing.

Setting the config:

const s3 = new AWS.S3({
  signatureVersion: 'v4',
  accessKeyId: YOUR_ACCESS_KEY_ID,
  secretAccessKey: YOUR_SECRET_ACCESS_KEY
});

Handling errors, if any:

// upload file to S3
      s3.upload({
        Bucket: config.s3BucketName,
        Key: fileName,
        Body: fileContent
      }, (err) => {
        if (err) {
          console.error(err);
        } else {
          console.log(`Successfully uploaded '${fileName}'!`);
        }
      });

@SergiiVdovareize

SergiiVdovareize commented Jun 1, 2018

This will probably break your inner folder structure https://gist.github.com/jlouros/9abc14239b0d9d8947a3345b99c4ebcb#file-aws-upload-folder-to-s3-js-L30
All the objects will be copied to the root path.
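For anyone hitting this: below is a minimal sketch (not part of the original gist) of one way to keep the folder structure, by walking the directory tree and using each file's path relative to the upload root as the S3 key. The bucket name and folder path are placeholders.

const AWS = require('aws-sdk');
const fs = require('fs');
const path = require('path');

const s3 = new AWS.S3({ signatureVersion: 'v4' });
const bucketName = 'your.s3.bucket.name';        // placeholder
const rootDir = path.join(__dirname, '../dist'); // placeholder

// walk the directory tree and collect absolute file paths
const listFiles = (dir) =>
  fs.readdirSync(dir).reduce((acc, entry) => {
    const fullPath = path.join(dir, entry);
    return fs.statSync(fullPath).isDirectory()
      ? acc.concat(listFiles(fullPath))
      : acc.concat(fullPath);
  }, []);

// use the path relative to the root folder (with forward slashes) as the S3 key
listFiles(rootDir).forEach((filePath) => {
  const key = path.relative(rootDir, filePath).split(path.sep).join('/');
  s3.upload({
    Bucket: bucketName,
    Key: key,
    Body: fs.createReadStream(filePath)
  }, (err) => {
    if (err) {
      console.error(`failed to upload '${key}'`, err);
    } else {
      console.log(`uploaded '${key}'`);
    }
  });
});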

@mikhail-angelov

It could be replaced by this command:
aws s3 sync <folder-path> s3://<bucket-name>

@crsepulv

@mikhail-angelov how can I use aws cli (aws s3 sync) from a Lambda function since it is not included by default?

@mikhail-angelov

@crsepulv, it's probably hard to use the aws cli from Lambda, and I'm not sure you need it there; the Lambda environment has some restrictions on reading/writing files.

But if you need it, there is a way to do the sync via aws-sdk:

const AWS = require('aws-sdk')
const co = require('co')
const fs = require('fs')
AWS.config.update({ region: 'us-east-1' })
AWS.config.setPromisesDependency(Promise)
const s3 = new AWS.S3({ apiVersion: '2006-03-01' })

function listFiles(dir, acc) {
  const files = fs.readdirSync(dir) || []
  files.forEach((value) => {
    const name = `${dir}/${value}`
    if (fs.statSync(name).isDirectory()) {
      listFiles(name, acc)
    } else {
      acc.push(name)
    }
  })
  return acc
}

co(function* () {
  const bucketId = 'your-bucket'
  const files = listFiles('local-folder', [])
  for (const file of files) {
    console.log('uploading file: ', file)
    yield s3.upload({
      Bucket: bucketId,
      Key: file,
      Body: fs.readFileSync(file),
    }).promise()
  }
})
  .catch((err) => {
    console.log('get error: ', err)
  })

@ulver2812

Unfortunately the official js aws sdk doesn't have the aws s3 sync <folder-path> s3://<bucket-name> feature.
I'm looking for a node package that allows the use of the sync feature. I have found this:
https://github.com/andrewrk/node-s3-client
but the project seems abandoned and has several unaddressed issues.

@hackhat

hackhat commented Sep 12, 2018

Anybody found a solution for this? Something that keeps the folder structure.

@hackhat

hackhat commented Sep 12, 2018

Added one that works on Windows and keeps the file structure intact: https://gist.github.com/hackhat/cc0adf1317eeedcec52b1a4ff38f738b

The other examples are not working properly on Windows.

@valentinbdv

Hi hackhat,

I have used this one, which is more recent, and it has worked perfectly for me:
https://www.npmjs.com/package/s3-node-client

Cheers ;)

@sarkistlt

sarkistlt commented Nov 6, 2018

const fs = require('fs');
const path = require('path');
const async = require('async');
const AWS = require('aws-sdk');
const readdir = require('recursive-readdir');

const { BUCKET, KEY, SECRET } = process.env;
const rootFolder = path.resolve(__dirname, './');
const uploadFolder = './upload-folder';
const s3 = new AWS.S3({
  signatureVersion: 'v4',
  accessKeyId: KEY,
  secretAccessKey: SECRET,
});

function getFiles(dirPath) {
  return fs.existsSync(dirPath) ? readdir(dirPath) : [];
}

async function deploy(upload) {
  if (!BUCKET || !KEY || !SECRET) {
    throw new Error('you must provide env. variables: [BUCKET, KEY, SECRET]');
  }

  const filesToUpload = await getFiles(path.resolve(__dirname, upload));

  return new Promise((resolve, reject) => {
    async.eachOfLimit(filesToUpload, 10, async.asyncify(async (file) => {
      const Key = file.replace(`${rootFolder}/`, '');
      console.log(`uploading: [${Key}]`);
      return new Promise((res, rej) => {
        s3.upload({
          Key,
          Bucket: BUCKET,
          Body: fs.readFileSync(file),
        }, (err) => {
          if (err) {
            return rej(new Error(err));
          }
          res({ result: true });
        });
      });
    }), (err) => {
      if (err) {
        return reject(new Error(err));
      }
      resolve({ result: true });
    });
  });
}

deploy(uploadFolder)
  .then(() => {
    console.log('task complete');
    process.exit(0);
  })
  .catch((err) => {
    console.error(err.message);
    process.exit(1);
  });

@larryg01

larryg01 commented Jul 1, 2020

@sarkistlt Thanks for this, it works great.

@alexpopovme

This is what I use:

const path = require('path')
const util = require('util')
const exec = util.promisify(require('child_process').exec)

const distDir = path.join('someDir', 'dist')
const command = `aws s3 sync ${distDir} s3://bucket-name`

exec(command)
  .then(() => console.log('Deploy complete'))
  .catch(err => {
    console.log(err)
  })

@adrienv1520

adrienv1520 commented Oct 17, 2020

Important notes:

  • uploads a directory and its sub-directories recursively;
  • path can be an absolute or relative path to a directory;
  • params and options are the same as in the AWS documentation, so these functions are very flexible;
  • rootKey is the root AWS key to use; by default it is the S3 root, e.g. if rootKey is public/images and you upload /Users/you/my-project/images, files will be uploaded to s3://bucket/public/images;
  • aws-sdk will automatically check for the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, which is the safest way to deal with credentials imo;
  • without clustering, I found uploading a directory of 1254 files was nearly 2 times faster than the native AWS CLI sync method (it's Python underneath, Node.js should be faster);
  • don't forget to add the file's content type, mostly for static websites, or it will be set to application/octet-stream by default and lead to unexpected behavior;
  • use your favorite debugger/logger over console;
  • const x = { ...params }; is the same as Object.assign BUT will not deeply clone objects, which could lead to unexpected object mutations; prefer a safe clone function or similar;
  • tested with Node.js 12.15.0;
  • improve this by clustering the whole upload; some extra code/controls will be needed (based on file sizes, number of files, available cores, etc.).
const { createReadStream, promises: { readdir, stat: getStats } } = require('fs');
const { resolve, join } = require('path');
const S3 = require('aws-sdk/clients/s3');
const { getMIMEType } = require('node-mime-types');

const s3 = new S3({
  signatureVersion: 'v4',
});

// upload file
const uploadFile = async function uploadFile({ path, params, options } = {}) {
  const parameters = { ...params };
  const opts = { ...options };

  try {
    const rstream = createReadStream(resolve(path));

    rstream.once('error', (err) => {
      console.error(`unable to upload file ${path}, ${err.message}`);
    });

    parameters.Body = rstream;
    parameters.ContentType = getMIMEType(path);
    await s3.upload(parameters, opts).promise();

    console.info(`${parameters.Key} (${parameters.ContentType}) uploaded in bucket ${parameters.Bucket}`);
  } catch (e) {
    throw new Error(`unable to upload file ${path} at ${parameters.Key}, ${e.message}`);
  }

  return true;
};

// upload directory and its sub-directories if any
const uploadDirectory = async function uploadDirectory({
  path,
  params,
  options,
  rootKey,
} = {}) {
  const parameters = { ...params };
  const opts = { ...options };
  const root = rootKey && rootKey.constructor === String ? rootKey : '';
  let dirPath;

  try {
    dirPath = resolve(path);
    const dirStats = await getStats(dirPath);

    if (!dirStats.isDirectory()) {
      throw new Error(`${dirPath} is not a directory`);
    }

    console.info(`uploading directory ${dirPath}...`);

    const filenames = await readdir(dirPath);

    if (Array.isArray(filenames)) {
      await Promise.all(filenames.map(async (filename) => {
        const filepath = `${dirPath}/${filename}`;
        const fileStats = await getStats(filepath);

        if (fileStats.isFile()) {
          parameters.Key = join(root, filename);
          await uploadFile({
            path: filepath,
            params: parameters,
            options: opts,
          });
        } else if (fileStats.isDirectory()) {
          await uploadDirectory({
            params,
            options,
            path: filepath,
            rootKey: join(root, filename),
          });
        }
      }));
    }
  } catch (e) {
    throw new Error(`unable to upload directory ${path}, ${e.message}`);
  }

  console.info(`directory ${dirPath} successfully uploaded`);
  return true;
};

// example
(async () => {
  try {
    console.time('s3 upload');

    await uploadDirectory({
      path: '../front/dist',
      params: {
        Bucket: 'my-bucket',
      },
      options: {},
      rootKey: '',
    });

    console.timeEnd('s3 upload');
  } catch (e) {
    console.error(e);
  }
})();

@adrienv1520

We've just released a package for that at https://github.com/thousandxyz/s3-lambo, fully tested. You can use the code or the package as you need.

@Prozi

Prozi commented Aug 30, 2021

I fixed a few minor errors with paths / other things and
published this as a package:

https://www.npmjs.com/package/s3-upload-folder/v/latest

the source (MIT)

https://github.com/Prozi/s3-upload-folder/blob/main/index.js

It uses standard S3 SDK authentication (you'll need to read about this if you don't know what I mean); see the sketch below.
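For reference (my own sketch, not taken from the package's docs), "standard SDK authentication" with aws-sdk v2 means the default credential chain: keys are picked up from the environment, the shared credentials file, or an attached IAM role, so nothing is hard-coded. The bucket name below is a placeholder.

// aws-sdk v2 resolves credentials automatically, typically from:
//   - environment variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY (and AWS_SESSION_TOKEN)
//   - the shared credentials file: ~/.aws/credentials
//   - an IAM role when running on EC2 / Lambda
const AWS = require('aws-sdk');

// no accessKeyId/secretAccessKey passed here; the default credential chain is used
const s3 = new AWS.S3({ signatureVersion: 'v4' });

// quick sanity check that credentials and bucket access work
s3.listObjectsV2({ Bucket: 'your-bucket-name' }, (err, data) => {
  if (err) {
    console.error('credential or bucket problem:', err.message);
  } else {
    console.log(`found ${data.Contents.length} objects`);
  }
});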

enjoy!
