@kennyhyun
Last active June 19, 2024 05:55
lambda thumbnail generator

lambda-generate-thumbnail

a lambda deployment package for generating thumbnails

setting up S3 for thumbnail generation

make a zip file

in the directory that has index.js:

  1. npm init -y
  2. npm install async gm pdf2png --save
  3. make a zip including node_modules and index.js, named something like CreateThumbnail.zip (e.g. zip -r CreateThumbnail.zip index.js node_modules); a minimal index.js skeleton follows this list
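
The zip needs index.js at the top level of the archive (not inside a subdirectory), and that file must export the handler function. Below is a minimal sketch of index.js; the full version used by this gist is listed at the end.

// index.js -- minimal skeleton; see the full listing at the end of this gist
exports.handler = function(event, context) {
  // event: the S3 notification payload; context: the Lambda invocation context
  console.log('received event', JSON.stringify(event));
  context.done();
};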

create role

go to AWS Console IAM

from the official tutorial

Follow the steps in Creating a Role to Delegate Permissions to an AWS Service in the IAM User Guide to create an IAM role (execution role). As you follow the steps to create the role, note the following:

  1. In Role Name, use a name that is unique within your AWS account (for example, lambda-s3-execution-role).
  2. In Select Role Type, choose AWS Service Roles, and then choose AWS Lambda. This grants the AWS Lambda service permissions to assume the role.
  3. In Attach Policy, choose AWSLambdaExecute (a sketch of what this managed policy grants follows this list).
  4. Write down the role ARN. You will need it in the next step when you create your Lambda function.
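
For reference, the AWSLambdaExecute managed policy grants roughly CloudWatch Logs access plus S3 GetObject/PutObject, which is all this function needs. The JSON below is only a sketch; check the policy in the IAM console for the exact document.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["logs:*"],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::*"
    }
  ]
}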

create lambda

go to AWS Console Lambda

  1. create a lambda function
  2. next
  3. select S3 as a trigger
  4. select a bucket like bucketname-assets-test
  5. event type: Object Created (All)
  6. next
  7. create the target bucket
  8. create a bucket named something like bucketname-thumbnails
  9. configure
  10. name such as CreateThumbnail
  11. Runtime: Node.js 4.3
  12. Lambda function: Upload a .ZIP file
  13. select the file (CreateThumbnail.zip)
  14. Lambda function handler and role: Handler: index.handler; Role: Choose an existing role; Existing role: the role created earlier (e.g. lambda-s3-exec-role)
  15. Advanced settings: leave it as it is
  16. next
  17. create function

try

  1. enable the trigger
  2. upload a file into the source S3 bucket (e.g. bucketname-assets-test); this fires an event like the sketch below
  3. you can check the logs in CloudWatch
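
When the trigger fires, the function receives an S3 put notification shaped roughly like the following (abbreviated to the fields the handler actually reads; a real event carries much more metadata, and the file name here is just an example):

{
  "Records": [
    {
      "s3": {
        "bucket": { "name": "bucketname-assets-test" },
        "object": { "key": "sample.jpg" }
      }
    }
  ]
}

The handler below (the index.js bundled into CreateThumbnail.zip) reads the bucket name and object key from these fields: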
// dependencies
var async = require('async');
var AWS = require('aws-sdk');
// Enable ImageMagick integration.
var gm = require('gm').subClass({ imageMagick: true });
var util = require('util');
var pdf2png = require('pdf2png');
pdf2png.ghostscriptPath = "/usr/bin";

// constants
var MAX_WIDTH = 320;
var MAX_HEIGHT = 320;

// get reference to S3 client
var s3 = new AWS.S3();

exports.handler = function(event, context) {
  // Read options from the event.
  console.log("Reading options from event:\n", util.inspect(event, { depth: 5 }));
  var srcBucket = event.Records[0].s3.bucket.name;
  // Object key may have spaces or unicode non-ASCII characters.
  var srcKey =
    decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));

  // Derive the destination bucket from the source bucket name,
  // e.g. bucketname-assets-test -> bucketname-thumbnails-test.
  var isTest = srcBucket.indexOf('-test') >= 0;
  var dstBucket = srcBucket.split('-')[0];
  dstBucket += "-thumbnails";
  if (isTest) {
    dstBucket += '-test';
  }
  var dstKey = srcKey;

  // Sanity check: validate that source and destination are different buckets.
  if (srcBucket == dstBucket) {
    console.error("Destination bucket must not match source bucket.");
    return;
  }

  // Infer the image type from the file extension.
  var typeMatch = srcKey.match(/\.([^.]*)$/);
  if (!typeMatch) {
    console.error('unable to infer image type for key ' + srcKey);
    return;
  }
  var imageType = typeMatch[1].toLowerCase();
  if (imageType != "jpg" && imageType != "png" && imageType != "pdf") {
    console.log('skipping non-image ' + srcKey);
    return;
  }

  // Download the image from S3, transform, and upload to a different S3 bucket.
  async.waterfall([
    function download(next) {
      // Download the image from S3 into a buffer.
      s3.getObject({
        Bucket: srcBucket,
        Key: srcKey
      }, next);
    },
    function transform(response, next) {
      if (imageType == "pdf") {
        // Render the PDF to a PNG via Ghostscript.
        pdf2png.convert(response.Body, function(resp) {
          if (!resp.success) {
            next(resp.error);
            return;
          }
          next(null, "image/png", resp.data);
        });
      } else {
        gm(response.Body).size(function(err, size) {
          if (err) {
            next(err);
            return;
          }
          // Infer the scaling factor to avoid stretching the image unnaturally.
          var scalingFactor = Math.min(
            MAX_WIDTH / size.width,
            MAX_HEIGHT / size.height
          );
          var width = scalingFactor * size.width;
          var height = scalingFactor * size.height;
          // Transform the image buffer in memory.
          this.resize(width, height)
            .toBuffer(imageType, function(err, buffer) {
              if (err) {
                next(err);
              } else {
                next(null, response.ContentType, buffer);
              }
            });
        });
      }
    },
    function upload(contentType, data, next) {
      // Stream the transformed image to a different S3 bucket.
      var ext = '';
      if (imageType == "pdf") {
        // PDF thumbnails get a .png suffix and go back to the source bucket.
        ext = '.png';
        dstBucket = srcBucket;
      }
      s3.putObject({
        Bucket: dstBucket,
        Key: dstKey + ext,
        Body: data,
        ContentType: contentType,
        ACL: 'public-read'
      }, next);
    }
  ], function(err) {
    if (err) {
      console.error(
        'Unable to resize ' + srcBucket + '/' + srcKey +
        ' and upload to ' + dstBucket + '/' + dstKey +
        ' due to an error: ' + err
      );
    } else {
      console.log(
        'Successfully resized ' + srcBucket + '/' + srcKey +
        ' and uploaded to ' + dstBucket + '/' + dstKey
      );
    }
    context.done();
  });
};
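
To smoke-test the handler before uploading the zip, you can require it locally and feed it a fake event like the sketch below. This assumes AWS credentials are available in the environment, that the buckets and the sample object actually exist, and that ImageMagick (plus Ghostscript for PDFs) is installed on the local machine; the file names are examples only.

// local-test.js -- rough local invocation sketch, not part of the deployment package
var handler = require('./index').handler;

var fakeEvent = {
  Records: [{
    s3: {
      bucket: { name: 'bucketname-assets-test' },
      object: { key: 'sample.jpg' } // example key; must exist in the bucket
    }
  }]
};

var fakeContext = {
  done: function(err) {
    console.log('handler finished', err || 'ok');
  }
};

handler(fakeEvent, fakeContext);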

chewlim commented Apr 10, 2019

Hey, any plan to upgrade it to Runtime Node.js 8.10?
