
@cben
Created April 16, 2015 08:35

[since the source blog doesn't support comments and the margins of a tweet are too narrow]

Hmm, interesting distinction. But I don't see why your way is better than having the CI build the image.

  • It is more flexible, letting me build the image in any ad-hoc way I want, e.g. modify files inside a container and take a snapshot.
    But my primary dev instinct is that I don't want this. The last thing I want in prod is an image I don't have sources for. I'm not thinking in terms of "test this image before deploying" but in terms of "test this source code". Which means: build an image from the Dockerfile in a clean env, test this "clean" image (and yes, push this same image to prod).

  • Does your way improve dev/prod parity?
    After all, while developing I was probably doing some manual "informal testing" on my dev image. Shouldn't the prod image then == the dev image? Perhaps, but

    1. I'd take a clean, reproducible build over bit-for-bit dev/prod parity any day.
    2. If one really insists, one could use CI builds in local development! For big images that might be painfully slow [^1], but docker gives me enough confidence that local builds ~= CI builds to skip it.
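For concreteness, the CI flow I have in mind is just a few commands — a sketch only; the registry, image name, and test entrypoint here are all made up:

```shell
#!/bin/sh -e
# Hypothetical CI job: clean build from the Dockerfile, test the
# resulting image itself, then push that *same* image for prod to pull.
IMAGE="registry.example.com/myapp:$GIT_COMMIT"   # registry/name are made up

docker build --no-cache -t "$IMAGE" .   # clean env: no stale cached layers
docker run --rm "$IMAGE" run-tests      # test the built image, not the checkout
docker push "$IMAGE"                    # prod deploys exactly what was tested
```

The point of `--no-cache` is that the image provably comes from the checked-out sources, and pushing the identical tag means the tested artifact and the deployed artifact can't diverge.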

[^1] BTW, network speed is a problem for your design. For big images and users on typical internet connections, uploading just the sources and having them built and deployed in the cloud will be faster than building a big image locally and uploading that. rsync, and especially bup, can optimize the upload of similar images.
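The rsync/bup point rests on content-addressed dedup: split data at content-determined cut points, hash each chunk, and upload only chunks the server doesn't already have — so two similar images share most chunks. A toy Python sketch of the idea (not bup's actual algorithm; real tools use rolling checksums rather than hashing a window at every byte, and the window size and cut condition here are arbitrary):

```python
import hashlib

def chunks(data, window=16):
    """Toy content-defined chunking: cut wherever a hash of the trailing
    `window` bytes looks 'special'. Same content => same cut points, so
    similar blobs end up sharing most of their chunks."""
    out, start = [], 0
    for i in range(window, len(data)):
        if hashlib.sha1(data[i - window:i]).digest()[-1] < 4:  # ~1/64 of positions
            out.append(data[start:i])
            start = i
    out.append(data[start:])  # trailing chunk
    return out

def upload_cost(server_has, new):
    """Bytes of `new` we must upload when the server already stores the
    chunks of `server_has`, addressing chunks by their hash."""
    have = {hashlib.sha1(c).digest() for c in chunks(server_has)}
    return sum(len(c) for c in chunks(new) if hashlib.sha1(c).digest() not in have)

old = bytes(range(256)) * 400                          # stand-in for a big image
new = old[:50000] + b"one small change" + old[50000:]  # slightly edited version
print(upload_cost(old, new), "of", len(new), "bytes actually need uploading")
```

Only the chunks around the edit differ, so re-uploading a slightly changed image costs a small fraction of its size — which is exactly why pushing diffs of similar images can beat re-uploading them whole.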
