I built an ecommerce site for a client that had a lot of high-resolution images (running about 500 GB/mo of transfer). Cloudinary charges $500/mo for that usage, while Amazon charges about $40. I wrote some middleware to wrap my Cloudinary URLs and enable caching. It's entirely transparent and still lets you use all the cool Cloudinary effect and resizing functions. Hopefully this is useful to someone!
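The original middleware isn't shown here, but the URL-wrapping idea can be sketched like this. All the names (cloud name, proxy host, the cachedUrl helper) are my own assumptions, not the author's code: the point is that the Cloudinary transformation segment survives the rewrite, which is what keeps the effect/resizing functions working.

```javascript
// Hypothetical helper: rewrite a Cloudinary delivery URL so it points at your
// own cheaper caching proxy (e.g. backed by S3), preserving the transformation
// segment (w_800,q_auto, etc.) so resize/effect options still apply.
const CLOUDINARY_PREFIX = 'https://res.cloudinary.com/my-cloud/image/upload/'; // assumed cloud name
const PROXY_PREFIX = 'https://img.example.com/'; // assumed proxy host

function cachedUrl(url) {
  // Only rewrite URLs that actually come from Cloudinary; pass others through.
  if (!url.startsWith(CLOUDINARY_PREFIX)) return url;
  return PROXY_PREFIX + url.slice(CLOUDINARY_PREFIX.length);
}
```

On a cache miss the proxy would fetch the original Cloudinary URL once, store the bytes, and serve them from the cache thereafter, so Cloudinary bandwidth is only paid once per transformation.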
I think using deasync() here is janky, but I couldn't think of another approach that allowed quite as easy a fix.
This is a neat solution for the Cloudinary images in particular, but I imagine most sites with bandwidth issues would be better served by a basic CDN setup or a reverse proxy. AWS charges $0.09-0.25 per GB transferred (for the first 10 TB/month), while a $20/mo CloudFlare plan gives you unlimited transfers. Something like that is just as transparent and easy to set up.
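Back-of-the-envelope math on those numbers (using the figures quoted above, with the low end of the AWS range):

```javascript
// Cost comparison using the figures from this thread.
const gbPerMonth = 500;     // traffic from the original post
const awsPerGb = 0.09;      // low end of the quoted $0.09-0.25/GB range
const cloudflareFlat = 20;  // quoted flat monthly plan

const awsCost = gbPerMonth * awsPerGb; // ~45
console.log(`AWS egress: ~$${awsCost.toFixed(2)}/mo vs CloudFlare: $${cloudflareFlat}/mo flat`);
```

So at 500 GB/mo the two are in the same ballpark, but the flat plan wins as soon as traffic grows.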
On the flip side, if the middleware code above were refactored to use the official AWS SDK package, you could leverage its ability to create signed URLs for S3 objects. That would let you generate a unique, time-limited URL each time an image is referenced. This has some interesting effects, such as making it much harder for other sites to hotlink your images.
Just an idea..