
Forked from dboskovic/
Created May 24, 2016
KeystoneJS: Cloudinary Cache => Amazon S3

I built an ecommerce site for a client that had a lot of high-resolution images (running about 500 GB/mo of bandwidth). Cloudinary charges $500/mo for that usage; Amazon charges about $40. So I wrote some middleware to wrap my Cloudinary URLs in order to enable caching on S3. It's entirely transparent and still lets you use all the cool Cloudinary effect and resizing functions. Hopefully this is useful to someone!

I think using deasync() here is janky, but I couldn't come up with another approach that allowed quite as easy a fix.
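For comparison, the lookup could be written without deasync by making the helper asynchronous and resolving a promise before the template renders — though that means the template can no longer call it synchronously, which is the convenience deasync buys. A sketch, with the hypothetical `findImage` standing in for the real Imagecache query (stubbed here with setTimeout):

```javascript
// Hypothetical deasync-free alternative: wrap the callback-style lookup
// in a promise instead of busy-waiting on the event loop.
function findImage(hash, callback) {
  // stand-in for image_cache.findOne({ hash: hash }, callback)
  setTimeout(function () {
    callback(null, { hash: hash, uploaded: true });
  }, 10);
}

function getImageAsync(hash) {
  return new Promise(function (resolve, reject) {
    findImage(hash, function (err, data) {
      if (err) return reject(err);
      resolve(data);
    });
  });
}

getImageAsync('abc123').then(function (img) {
  console.log(img.uploaded); // true for the stubbed lookup
});
```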

// ./models/Imagecache.js
var keystone = require('keystone'),
    Types = keystone.Field.Types;

var Imagecache = new keystone.List('Imagecache');

Imagecache.add({
    hash: { type: Types.Text, index: true },
    uploaded: { type: Types.Boolean, index: true }
});

Imagecache.register();
// add this to ./routes/middleware.js
var crypto = require('crypto');
var request = require('request');
var path = require('path');
var fs = require('fs');
var s3 = require('s3');
var deasync = require('deasync');

var image_cache = keystone.list('Imagecache').model;

var temp_dir = path.join(process.cwd(), 'temp/');
if (!fs.existsSync(temp_dir)) {
    fs.mkdirSync(temp_dir);
}

var s3_client = s3.createClient({
    multipartUploadThreshold: 20971520, // this is the default (20 MB)
    multipartUploadSize: 15728640, // this is the default (15 MB)
    s3Options: {
        accessKeyId: "ACCESS_KEY",
        secretAccessKey: "SECRET"
    }
});

// if you already have an initLocals, just add the gi() function to it
exports.initLocals = function (req, res, next) {
    res.locals.gi = function (img) {
        // console.log('looking for image =>',img)
        var md5 = crypto.createHash('md5');
        var hash = md5.update(img).digest('hex');
        var db_image;

        // block until the mongoose query comes back (this is the deasync jank)
        function getImage(hash) {
            var response;
            image_cache.findOne({ hash: hash }, function (err, data) {
                response = data;
            });
            while (response === undefined) {
                deasync.runLoopOnce();
            }
            return response;
        }

        db_image = getImage(hash);

        if (!db_image || !db_image.uploaded) {
            if (!db_image) {
                // console.log('starting image upload')
                var $img = new image_cache({ hash: hash, uploaded: false });
                $img.save();
                request(img).pipe(fs.createWriteStream(temp_dir + "/" + hash + ".jpg")).on('close', function () {
                    var params = {
                        localFile: temp_dir + "/" + hash + ".jpg",
                        s3Params: {
                            Bucket: "YOUR_BUCKET",
                            Key: hash + '.jpg'
                        }
                    };
                    var uploader = s3_client.uploadFile(params);
                    uploader.on('error', function (err) {
                        console.error('unable to upload:', err.stack);
                    });
                    uploader.on('end', function () {
                        console.log('successful image upload', img);
                        $img.uploaded = true;
                        $img.save();
                    });
                });
            }
            // not cached yet, so fall back to the original cloudinary url
            return img;
        }
        else {
            // console.log('returning image =>',req.protocol+'://'+hash+'.jpg')
            // NOTE: the host portion of this url is truncated in the source;
            // prepend your S3 bucket or CDN domain here
            return req.protocol + '://' + hash + '.jpg';
        }
    };
    next();
};
// Usage notes:
// - show a product photo where the product has already been loaded by the controller and put into scope
// - note that Keystone's cloudinary photo method simply returns an http://... url to the cloudinary image
// - the gi() method just requests that url, sends it to s3, and then updates the database when it's available