@homam
Created January 27, 2014 10:08
How to upload files to AWS S3 with the Node.js SDK
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only; never commit real credentials
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

// Read the file into a Buffer and store it on S3
fs.readFile('del.txt', function (err, data) {
    if (err) { throw err; }
    var s3 = new AWS.S3();
    s3.putObject({
        Bucket: 'banners-adxs',
        Key: 'del2.txt',
        Body: data, // fs.readFile already returns a Buffer
        ACL: 'public-read'
    }, function (err, resp) {
        if (err) { return console.error(err); }
        console.log('Successfully uploaded package.', resp);
    });
});
@dev9846 commented Mar 1, 2017

I am trying to upload an image file. The file is successfully uploaded to my S3 bucket, but it's an empty file that gets uploaded to S3.

@zerosand1s commented May 8, 2017

@dev9846 try using the s3.upload() method instead of putObject().
@amulyakashyap09 when using the upload() method, once the file is uploaded, AWS returns an object containing the URL of the uploaded file.
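For reference, a minimal sketch of that pattern (the bucket name, prefix, and file path are made-up placeholders, and the SDK is required lazily so the pure helper runs on its own):

```javascript
// Pure helper: build the S3 object key from an optional prefix.
function makeKey(prefix, fileName) {
  return prefix ? prefix.replace(/\/+$/, '') + '/' + fileName : fileName;
}

// Upload with s3.upload(); on success the callback's data.Location
// holds the URL of the uploaded object.
function uploadFile(bucketName, prefix, filePath, callback) {
  var AWS = require('aws-sdk'); // loaded here so makeKey() works without the SDK
  var fs = require('fs');
  var path = require('path');
  var s3 = new AWS.S3();

  s3.upload({
    Bucket: bucketName,
    Key: makeKey(prefix, path.basename(filePath)),
    Body: fs.createReadStream(filePath)
  }, function (err, data) {
    if (err) { return callback(err); }
    callback(null, data.Location); // URL of the uploaded object
  });
}
```

Something like uploadFile('my-bucket', 'images', './pic.png', console.log) would then log the returned URL (both names are placeholders).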

@kailashyogeshwar85 commented May 31, 2017

@zerosand1s I think upload() takes a stream as its argument, while putObject() can take a buffer as input.

@ChrisGrigg

I find s3.upload() unreliable when used with busboy and 'multipart form' data. It silently fails occasionally and doesn't recover. Anyone know of a solution to this? I may end up using 'putObject' instead.

@nicosoutn2014

Hi, I'm using the AWS config for the S3 const but I'm still getting an "Access Denied" error. I've created a policy for the IAM user and set it with all S3 actions. Please help :(

ghost commented Nov 24, 2017

Does anyone have experience with uploading .pdf documents? My problem is that documents get uploaded successfully, but when I try to open them, my browser says "Failed to load PDF document".
Any help would be highly appreciated. I've described my problem in more detail here: https://stackoverflow.com/questions/47475595/failed-to-load-pdf-document-using-aws-sdk-in-node-js

@tchaumeny

You should use Buffer.from(data, 'binary'), as the Buffer constructor pattern is deprecated (see https://nodejs.org/api/buffer.html).
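A quick illustration of the replacement (note that fs.readFile already hands you a Buffer, so often no conversion is needed at all):

```javascript
// Deprecated pattern, emits a DEP0005 warning on modern Node.js:
//   var buf = new Buffer('hello', 'utf8');

// Replacement: Buffer.from() with an explicit source and encoding.
var fromString = Buffer.from('hello', 'utf8');
var fromBytes = Buffer.from([0x68, 0x69]);

console.log(fromString.toString('base64')); // aGVsbG8=
console.log(fromBytes.toString('utf8'));    // hi
```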

@chandankrishnan

When I hit the image URL from the S3 bucket, the image doesn't show;
it shows an error like "Could not load image 'images.png'".

Any help would be highly appreciated.

@sarfarazansari

@almostprogrammer, @SylarRuby, @chandankrishnan
I've created a gist that works with any kind of file and also with chunked data.
Here is the link: https://gist.github.com/sarfarazansari/59d5cf4bb3b03acf069396ca92a79b3e

@jackyan2022

You should never put accessKey and secretKey in your code. Don't ask me why I know that.

@Diego04 commented Jun 18, 2018

A simple solution:

const AWS = require('aws-sdk');
const fs = require('fs');

AWS.config.loadFromPath('./config/configS3.json');
const s3Bucket = new AWS.S3({ params: { Bucket: 'MY_BUCKET_NAME' } });

fs.readFile('./files/filename.png', function (err, fileData) {
    if (err) { return callback(err); }

    let params = {
        ACL: 'public-read',
        Key: 'filename.png',
        Body: fileData,
        ContentType: 'image/png'
    };

    // `callback` is assumed to be defined by the surrounding code
    s3Bucket.putObject(params, (error, data) => {
        if (error) { return callback(error); }
        callback(null, data);
    });
});

@coolaj86 commented Sep 16, 2019

The Right Way (that actually works)

I'm a senior JS dev and I thought I'd just do a quick search to find a copy/paste snippet for uploading to S3.
What I've found instead (collectively, not specifically this example) has horrified me, so I decided to write my own and share:

First, I recommend using a .env file for the configuration details (and never commit that file):

.env:

AWS_ACCESS_KEY=xxxxxxxxxxxxxxxx
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxx

.gitignore:

.env
.env.*

And this is the actual code snippet:

'use strict';

// This will read the .env (if it exists) into process.env
require('dotenv').config();

// These values will be either what's in .env,
// or what's in the Docker, Heroku, AWS environment
var AWS_ACCESS_KEY = process.env.AWS_ACCESS_KEY;
var AWS_SECRET_ACCESS_KEY = process.env.AWS_SECRET_ACCESS_KEY;

var AWS = require('aws-sdk');
var s3 = new AWS.S3({
    accessKeyId: AWS_ACCESS_KEY,
    secretAccessKey: AWS_SECRET_ACCESS_KEY
});

var fs = require('fs');
var path = require('path');

function uploadToS3(bucketName, keyPrefix, filePath) {
    // ex: /path/to/my-picture.png becomes my-picture.png
    var fileName = path.basename(filePath);
    var fileStream = fs.createReadStream(filePath);

    // If you want to save to "my-bucket/{prefix}/{filename}"
    //                    ex: "my-bucket/my-pictures-folder/my-picture.png"
    var keyName = path.join(keyPrefix, fileName);

    // We wrap this in a promise so that we can handle a fileStream error
    // since it can happen *before* s3 actually reads the first 'data' event
    return new Promise(function(resolve, reject) {
        fileStream.once('error', reject);
        s3.upload(
            {
                Bucket: bucketName,
                Key: keyName,
                Body: fileStream
            }
        ).promise().then(resolve, reject);
    });
}

Usage:

uploadToS3("my-bucket-name", "", "../../be/careful/pic.jpg").then(function (result) {
  console.log("Uploaded to s3:", result.Location);
}).catch(function (err) {
  console.error("something bad happened:", err.toString());
});

See https://coolaj86.com/articles/upload-to-s3-with-node-the-right-way/ if you've tried and failed a few times with what you've found among the top google results and want to know why pretty much none of them work.

Of course, you could use lots of const, await, hashrockets, and arrow functions as well, but I prefer plain JS because it's easy even for novices from other languages to read and understand.

@Suleman24-dev

How do I get the public URL once the file is uploaded?

@shelooks16

How do I get the public URL once the file is uploaded?

  public async uploadObject(
    body: Buffer,
    bucket: string,
    key: string,
    mimeType: string
  ) {
    const params = {
      Bucket: bucket,
      Key: key,
      ACL: 'public-read',
      Body: body,
      ContentType: mimeType
    };
    return await this.s3
      .upload(params)
      .promise()
      .then((data) => data.Location) // <-- the public URL is available under data.Location
      .catch((err) => {
        throw new Error(err.message);
      });
  }

@Suleman24-dev commented Oct 7, 2019 via email

@VincentSit

@solderjs Thank you for your kind sharing, which is very helpful for junior devs.

@LacikIgor commented Jan 22, 2020

[quotes @coolaj86's "The Right Way (that actually works)" comment above]

In:

var fileName = path.basename(filePath);

Where do you get the path from?

@lowkeyshift

@LacikIgor

In:

var fileName = path.basename(filePath);
Where do you get the path from?

path is the Node.js path module, loaded with const path = require('path'). If you mean the actual file path, just reference it manually or through a variable that points at the file.

@evolross

What about uploading via a URL? I'm trying to figure out how to pass an image URL from the web and upload that image to S3. Many free image API libraries require this (e.g. pixabay).

@coolaj86 commented Jul 28, 2020

@evolross You can either open up an http request and download it as a file first, or pipe it.

Honestly, I think it's to your benefit to just do this with the bare node https api via pipes. If you want something quick and dirty for downloads, @root/request will get the job done. Or axios. Or whatever else people are using these days. They don't much matter. I rolled my own to be lightweight, request compatible, and to have 0 dependencies.

@bmitchinson commented Jul 30, 2020

Thanks @solderjs, using a fileStream worked perfectly for me.

Typescript snippet:

async putFileInBucket(awsS3: AWS.S3, bucketName: string, fileName: string, fileExtension: string, fileStream: fs.ReadStream): Promise<void> {
    await awsS3.putObject({
        Body: fileStream,
        Bucket: bucketName,
        Key: `${fileName}.${fileExtension}`
    }).promise()
    .catch((err: AWSError) => {
        throw new AWSException(err, "problem uploading file")
    })
}

const fileStream = fs.createReadStream(path.join(__dirname, "sample.wav"))
await this.putFileInBucket(awsS3, bucketName, "myFile", "wav", fileStream)

@reddeiah18g

Hi team,
I need to download PDF files from a website, zip them, and upload the zip file to an S3 bucket, using a Lambda function with Node.js.
Can someone help me with this?
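One possible shape for the zip step, assuming the third-party archiver package (the bucket, prefix, and file list are placeholders, and both dependencies are required lazily so the pure helper runs on its own; this is a sketch, not a tested Lambda handler):

```javascript
// Pure helper: name the archive after the upload date.
function zipKey(prefix, date) {
  return prefix + '/' + date.toISOString().slice(0, 10) + '.zip';
}

// Zip a list of local PDF files and stream the archive into S3.
// s3.upload() accepts a stream of unknown length, unlike putObject().
function zipAndUpload(bucketName, prefix, pdfPaths) {
  var AWS = require('aws-sdk');
  var archiver = require('archiver');
  var path = require('path');
  var s3 = new AWS.S3();

  var archive = archiver('zip');
  pdfPaths.forEach(function (p) {
    archive.file(p, { name: path.basename(p) });
  });
  archive.finalize(); // no more entries; the stream ends when zipping is done

  return s3.upload({
    Bucket: bucketName,
    Key: zipKey(prefix, new Date()),
    Body: archive
  }).promise();
}
```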

@sadiaRia

It works for me :)

const uploadContentFromFilePath = (fileName) => {
    const fileContent = fs.createReadStream(fileName);
    return new Promise(function (resolve, reject) {
        fileContent.once('error', reject);
        s3.upload(
            {
                Bucket: 'test-bucket',
                Key: fileName + '_' + Date.now().toString(),
                ContentType: 'application/pdf',
                ACL: 'public-read',
                Body: fileContent
            },
            function (err, result) {
                if (err) {
                    reject(err);
                    return;
                }
                resolve(result.Location);
            }
        );
    });
}

@haojunfu

If you get the error 'putObject is not defined', you can write it like this:

var s3 = new AWS.S3();

s3.putObject({
    Bucket: 'xxx',
    Key: 'xxx',
    Body: 'what you want to upload'
}, function (err) {
    if (err) { return console.error(err); }
    console.log('Successfully uploaded package.');
});

@DYW972 commented Jan 31, 2022

Thank you, everybody, for your comments 🙌

@coolaj86 Thank you for your snippet. I just wonder how to use it with a file input form?

Thanks

@brysonbw

By far the best simple and straightforward code/implementation and brief explanation I've found. Been stuck on this for 2-3 days lol Thanks guys - peace and love

@kotnibf commented May 22, 2023

How can I create an S3 bucket and get its ID?
