How to upload files to AWS S3 with NodeJS SDK
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only; in production, load credentials from the
// environment or an IAM role instead of hard-coding them
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

// Read in the file and upload its contents to S3
fs.readFile('del.txt', function (err, data) {
    if (err) { throw err; }
    var base64data = new Buffer(data, 'binary');
    var s3 = new AWS.S3();
    s3.client.putObject({
        Bucket: 'banners-adxs',
        Key: 'del2.txt',
        Body: base64data,
        ACL: 'public-read'
    }, function (err, resp) {
        if (err) { throw err; }
        console.log(resp);
        console.log('Successfully uploaded package.');
    });
});
@amulyakashyap09 commented Jan 29, 2017

Now how will we get the public URL of the uploaded file?

@Keramet commented Feb 9, 2017

https://s3-{your-region}.amazonaws.com/banners-adxs/del2.txt

P.S. I think it's better to use s3.putObject({ ... }) directly, without the .client.
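
A minimal sketch of that suggestion (bucket and key are the examples from the gist; the region is an assumption, swap in your own):

var AWS = require('aws-sdk');
var s3 = new AWS.S3({ region: 'eu-west-1' }); // assumed region

s3.putObject({
    Bucket: 'banners-adxs',
    Key: 'del2.txt',
    Body: 'hello world',
    ACL: 'public-read'
}, function (err, data) {
    if (err) { throw err; }
    // putObject() does not return a URL, so build it from region, bucket, and key
    console.log('https://s3-eu-west-1.amazonaws.com/banners-adxs/del2.txt');
});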

@dev9846 commented Mar 1, 2017

I am trying to upload an image file. The file is successfully uploaded to my S3 bucket, but what ends up on S3 is an empty file.

@zerosand1s commented May 8, 2017

@dev9846 try using the s3.upload() method instead of putObject().
@amulyakashyap09 when using the upload() method, once the file is uploaded, AWS will return an object which contains the URL of the uploaded file.
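
For example (a sketch; bucket and file names are placeholders):

var AWS = require('aws-sdk');
var fs = require('fs');
var s3 = new AWS.S3();

s3.upload({
    Bucket: 'my-bucket',
    Key: 'my-picture.png',
    Body: fs.createReadStream('./my-picture.png')
}, function (err, data) {
    if (err) { throw err; }
    console.log(data.Location); // public URL of the uploaded object
});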

@kailashyogeshwar85 commented May 31, 2017

@zerosand1s I think upload() takes a stream as an argument, while putObject() can take a buffer as input.
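
A rough illustration of that difference (bucket and file names are placeholders):

var AWS = require('aws-sdk');
var fs = require('fs');
var s3 = new AWS.S3();

// upload() accepts a readable stream as the Body...
s3.upload({ Bucket: 'my-bucket', Key: 'a.txt', Body: fs.createReadStream('a.txt') },
    function (err, data) { if (err) { throw err; } console.log(data.Location); });

// ...while putObject() wants a body whose length is known up front, e.g. a Buffer
fs.readFile('b.txt', function (err, buf) {
    if (err) { throw err; }
    s3.putObject({ Bucket: 'my-bucket', Key: 'b.txt', Body: buf },
        function (err) { if (err) { throw err; } console.log('uploaded b.txt'); });
});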

@ChrisGrigg commented Oct 2, 2017

I find s3.upload() unreliable when used with busboy and multipart form data. It occasionally fails silently and doesn't recover. Does anyone know of a solution to this? I may end up using putObject instead.

@nicosoutn2014 commented Oct 15, 2017

Hi, I'm using the AWS config for the S3 client, but I'm still getting an "Access Denied" error. I've created a policy for the IAM user and set it with all S3 actions. Please help :(

@ghost commented Nov 24, 2017

Does anyone have experience with uploading .pdf documents? My problem is that the documents get uploaded successfully, but when I try to open them, my browser says "Failed to load PDF document".
Any help would be highly appreciated. I've described my problem in more detail here: https://stackoverflow.com/questions/47475595/failed-to-load-pdf-document-using-aws-sdk-in-node-js

@tchaumeny commented Dec 11, 2017

You should use Buffer.from(data, 'binary'), as the new Buffer() constructor pattern is deprecated (see https://nodejs.org/api/buffer.html).
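
Applied to the snippet above, that would be:

// deprecated:
var base64data = new Buffer(data, 'binary');

// preferred:
var base64data = Buffer.from(data, 'binary');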

@chandankrishnan commented Dec 18, 2017

When I hit the image URL from the S3 bucket, the image does not show;
it shows an error like "Could not load image 'images.png'".

Any help would be highly appreciated

@SylarRuby commented Dec 28, 2017

@sarfarazansari commented Apr 20, 2018

@almostprogrammer, @SylarRuby, @chandankrishnan
I've created a gist that will work with any kind of file and also works with chunked data.
Here is the link: https://gist.github.com/sarfarazansari/59d5cf4bb3b03acf069396ca92a79b3e

@hebe889900 commented Jun 12, 2018

You should never put accessKey and secretKey in your code. Don't ask me why I know that.

@Diego04 commented Jun 18, 2018

A simple solution:

const AWS = require('aws-sdk');
const fs = require('fs');

AWS.config.loadFromPath('./config/configS3.json');
// Default the Bucket param; set the request timeout via httpOptions
const s3Bucket = new AWS.S3({
    params: { Bucket: 'MY_BUCKET_NAME' },
    httpOptions: { timeout: 6000000 }
});

fs.readFile('./files/filename.png', function (err, fileData) {
    if (err) { return callback(err); } // callback comes from the enclosing function

    const params = {
        ACL: 'public-read',
        Key: 'filename.png',
        Body: fileData,
        ContentType: 'image/png'
    };

    s3Bucket.putObject(params, (error, data) => {
        if (error) { return callback(error); }
        callback(null, data);
    });
});
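
For reference, AWS.config.loadFromPath() expects a JSON file shaped like this (values are placeholders):

{
    "accessKeyId": "xxxxxxxxxxxxxxxx",
    "secretAccessKey": "xxxxxxxxxxxxxxxx",
    "region": "us-east-1"
}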

@solderjs commented Sep 16, 2019

The Right Way (that actually works)

I'm a senior JS dev and I thought I'd just do a quick search to find a copy/paste snippet for uploading to S3. What I've found instead (collectively, not specifically this example) has horrified me, so I decided to write my own and share:

First, I recommend using a .env file for the configuration details (and never commit that file):

.env:

AWS_ACCESS_KEY=xxxxxxxxxxxxxxxx
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxx

And this is the actual code snippet:

'use strict';

// This will read the .env (if it exists) into process.env
require('dotenv').config();

// These values will be either what's in .env,
// or what's in the Docker, Heroku, AWS environment
var AWS_ACCESS_KEY = process.env.AWS_ACCESS_KEY;
var AWS_SECRET_ACCESS_KEY = process.env.AWS_SECRET_ACCESS_KEY;

var AWS = require('aws-sdk');
var s3 = new AWS.S3({
    accessKeyId: AWS_ACCESS_KEY,
    secretAccessKey: AWS_SECRET_ACCESS_KEY
});

var fs = require('fs');
var path = require('path');

function uploadToS3(bucketName, keyPrefix, filePath) {
    // ex: /path/to/my-picture.png becomes my-picture.png
    var fileName = path.basename(filePath);
    var fileStream = fs.createReadStream(filePath);

    // If you want to save to "my-bucket/{prefix}/{filename}"
    //                    ex: "my-bucket/my-pictures-folder/my-picture.png"
    var keyName = path.join(keyPrefix, fileName);

    return new Promise(function(resolve, reject) {
        fileStream.once('error', reject);
        s3.upload(
            {
                Bucket: bucketName,
                Key: keyName,
                Body: fileStream
            }
        ).promise().then(resolve, reject);
    });
}
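
Usage would look something like this (bucket and paths are placeholders):

uploadToS3('my-bucket', 'my-pictures-folder', '/path/to/my-picture.png')
    .then(function (result) {
        console.log('Uploaded to', result.Location);
    })
    .catch(console.error);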

See https://coolaj86.com/articles/upload-to-s3-with-node-the-right-way/ if you've tried and failed a few times with what you've found among the top Google results and want to know why pretty much none of them work.

Of course, you could use lots of const, await, hashrockets, and arrow functions as well, but I prefer plain JS because it's easy even for novices from other languages to read and understand.

@Suleman24-dev commented Sep 27, 2019

How do we get the public URL once the file is uploaded?

@shelooks16 commented Oct 6, 2019

How do we get the public URL once the file is uploaded?

  public async uploadObject(
    body: Buffer,
    bucket: string,
    key: string,
    mimeType: string
  ) {
    const params = {
      Bucket: bucket,
      Key: key,
      ACL: 'public-read',
      Body: body,
      ContentType: mimeType
    };
    return await this.s3
      .upload(params)
      .promise()
      .then((data) => data.Location) // <-- the public URL is available under data.Location
      .catch((err) => {
        throw new Error(err.message);
      });
  }

@VincentSit commented Oct 18, 2019

@solderjs Thank you for your kind sharing, which is very helpful for junior devs.

@LacikIgor commented Jan 22, 2020

The Right Way (that actually works)

(quoting @solderjs's full comment above)

In:

var fileName = path.basename(filePath);

Where do you get the path from?
