Last active April 8, 2024 09:52
npm scripts for Docker

These are generic npm scripts that you can copy & paste into your package.json file as-is to get convenience scripts for managing your Docker images all in one place.

How to Use

npm i -g mrm-task-npm-docker
npx mrm npm-docker


Here's the code repository

Looking for npm scripts for AWS ECS? Go here!

Watch the video: Do More With Less JavaScript

Get the book: These scripts are referenced in my book Angular for Enterprise-Ready Web Applications.

Evergreen Docker Containers for Static or Angular/React/Vue/etc SPA Websites

These containers are always up-to-date with the base images from the latest LTS channel for Node and Alpine.


  • Cross-Platform: Works on Windows 10 and macOS.
  • docker:build: Builds your Docker image using the root Dockerfile and tags the image as latest and as whatever version is specified in package.json, like 1.0.0.
  • docker:run: Runs the image you built on your local Docker instance. When you run docker ps, your image will be identified by the imageName you specify in package.json.
  • docker:debug: Builds and runs the image; tails the console logs, so you can see what's happening inside the container; launches the target app URL http://localhost:imagePort in a browser; all in one command.

Note that on the very first run docker:debug may fail. In this case, simply re-run the command.

  • docker:publish: Publishes the image to the imageRepo specified. This can be on Docker Hub, AWS ECS or any other Docker repository you may create.
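For context, a minimal root Dockerfile these scripts could build against might look like the sketch below. This is an assumption-laden example, not the Dockerfile from the repository: the node:lts-alpine base matches the "evergreen" claim above, but the dist folder, port, and start command are placeholders you should adapt to your app.

```Dockerfile
# Minimal sketch of a Dockerfile for a Node-served SPA (paths are assumptions)
FROM node:lts-alpine

WORKDIR /app

# Install only production dependencies
COPY package*.json ./
RUN npm ci --only=production

# Copy the pre-built app (npm run build runs before docker:build via predocker:build)
COPY dist ./dist

# Should match internalContainerPort in package.json
EXPOSE 3000
CMD ["npm", "start"]
```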


  • Install Docker for Mac or Windows
  • npm i --save-dev cross-conf-env npm-run-all: Needed to ensure cross-platform functionality for scripts.

Configuring package.json in 2 Steps

Step 1

In your package.json file, add a new config property with four sub-properties, using your own values, as shown below:

  "config": {
    "imageRepo": "[namespace]/[repository]",
    "imageName": "custom_app_name",
    "imagePort": "3000",
    "internalContainerPort": "3000"

Step 2

Copy & paste these new scripts under the scripts property in package.json:

Note that docker:runHelper assumes that your code is listening on port 3000, as reflected by internalContainerPort. If this is not the case, update the value in the scripts.

  "scripts": {
    "predocker:build": "npm run build",
    "docker:build": "cross-conf-env docker image build . -t $npm_package_config_imageRepo:$npm_package_version",
    "postdocker:build": "npm run docker:tag",
    "docker:tag": " cross-conf-env docker image tag $npm_package_config_imageRepo:$npm_package_version $npm_package_config_imageRepo:latest",
    "docker:run": "run-s -c docker:clean docker:runHelper",
    "docker:runHelper": "cross-conf-env docker run -e NODE_ENV=local --name $npm_package_config_imageName -d -p $npm_package_config_imagePort:$npm_package_config_internalContainerPort $npm_package_config_imageRepo",
    "predocker:publish": "echo Attention! Ensure `docker login` is correct.",
    "docker:publish": "cross-conf-env docker image push $npm_package_config_imageRepo:$npm_package_version",
    "postdocker:publish": "cross-conf-env docker image push $npm_package_config_imageRepo:latest",
    "docker:clean": "cross-conf-env docker rm -f $npm_package_config_imageName",
    "predocker:taillogs": "echo Web Server Logs:",
    "docker:taillogs": "cross-conf-env docker logs -f $npm_package_config_imageName",
    "docker:open:win": "echo Trying to launch on Windows && timeout 2 && start http://localhost:%npm_package_config_imagePort%",
    "docker:open:mac": "echo Trying to launch on MacOS && sleep 2 && URL=http://localhost:$npm_package_config_imagePort && open $URL",
    "docker:debugmessage": "echo Docker Debug Completed Successfully! Hit Ctrl+C to terminate log tailing.",
    "predocker:debug": "run-s docker:build docker:run",
    "docker:debug": "run-s -cs docker:open:win docker:open:mac docker:debugmessage docker:taillogs"

You can customize the build command to run your tests before building the image like:

    "predocker:build": "npm run build -- --prod && npm test",


You're done. Now run your scripts. To build and publish an image, you only need to use two of the commands frequently:

  0. npm run docker:build: Builds and tags the image. After the first run, you can just use npm run docker:debug.
  1. npm run docker:debug: Test (optional), build, tag, run, tail, and launch your app in a browser to test.
  2. npm run docker:publish: Voilà, your results are published to the repository you've defined.

Publish on the Internet

You have two options: easy-ish and hard.

  1. Easy-ish: Use Cloud Run. Install the CLI:

$ brew cask install google-cloud-sdk   (macOS)
> choco install gcloudsdk              (Windows)

Deploy your image using the command below, replacing $IMAGE_NAME with the name of your Docker image, e.g. duluca/minimal-node-web-server:

$ gcloud beta run deploy --image $IMAGE_NAME

Follow the prompts to log in, add billing information, and you're good to go!

  2. Hard: Use AWS ECS. This is Amazon's Elastic Container Service, and it's pretty excellent to use with Docker. However, it is complicated to set up. But worry not; for step-by-step instructions, head over to npm scripts for AWS ECS.

Very helpful. Thanks for sharing!


AmreeshTyagi commented Apr 13, 2018

Thanks. It is really helpful.

postdocker:build or predocker:build are not used in any other command.

Please help me understand how to use them.


duluca commented Apr 15, 2018

@AmreeshTyagi with npm scripts, the keywords pre and post are used to execute helper scripts before or after the execution of a given script, respectively. So when docker:build is executed, predocker:build executes beforehand and postdocker:build afterwards.
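The pre/post convention is easy to verify with a throwaway package (the folder and script names below are made up for the demo):

```shell
# Throwaway demo: npm runs pre<name> before and post<name> after <name>
mkdir -p /tmp/npm-hooks-demo && cd /tmp/npm-hooks-demo
cat > package.json <<'EOF'
{
  "name": "hooks-demo",
  "version": "1.0.0",
  "scripts": {
    "prebuild": "echo before",
    "build": "echo building",
    "postbuild": "echo after"
  }
}
EOF
npm run --silent build
```

Running the single command `npm run build` executes all three scripts, in order.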


@duluca Thanks a lot.
I was not aware of pre & post. It will solve lots of my issues.


One more thing about docker-compose. Is it good to use docker-compose up instead of docker run?
I have three docker-compose files docker-compose.debug.yml for development, docker-compose.test.yml for test environment & docker-compose.yml for production.

Please suggest whether I should use three different docker-compose files, or whether environments can be handled using Docker alone in the package.json file.


I am following the instructions in your eBook for Angular 6 Enterprise-Ready Web Applications.

npm run docker:clean is failing on me. It is saying that my container imageName doesn't exist.

But looking at the Docker images in my computer, I see 2 images with the imageName plus colon (:) 0.1.0 and latest. What could I be doing wrong?


Never mind. I just had to run the command again and then it worked.

First time working with Docker and with some of these dev dependency libraries.


Thanks for sharing.
I have a quick question. I have my docker-compose file in a sub-directory, and from an npm command I'd like to point to this directory in my scripts:

 "docker": "docker-compose up -d --build && docker run -d my-image "
but how do I point to the docker directory where my docker-compose file sits?

thanks for your help


iuviene commented Nov 7, 2018

I am also new to Docker and following along in your Angular 6 for Enterprise apps book. When running 'npm run docker:build' on macOS, I'm getting the error 'Can't connect to the Docker daemon.' I understood that I shouldn't need to manually start the Docker daemon. Any thoughts on what is likely causing this problem would be appreciated.


A couple of typo/clarity edits can be found at my fork, @duluca, if you're interested.


duluca commented Jun 2, 2019

@wyznaga thank you, changes incorporated


duluca commented Jun 2, 2019

@guillermoarellano Yes, trying again works. There's a way to make it work even on the first go, but then that would mask errors that can happen on later builds. I will document the behavior.


duluca commented Jun 2, 2019

@PhilJobCloub I have a sample docker-compose file that demonstrates how you can use docker-compose to build any dependent images. See the CONTEXT command documentation to be able to reference Dockerfiles from different folders:
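As a sketch of what that looks like (the service name and folder layout below are assumptions, not the sample file itself), a docker-compose.yml at the repository root can point each service's build context at a subfolder containing that service's Dockerfile:

```yaml
# Sketch: compose file at the repo root, Dockerfile in a subfolder
version: '3'
services:
  web-app:
    build:
      context: ./web-app        # folder that contains this service's Dockerfile
      dockerfile: Dockerfile
    ports:
      - '3000:3000'
```

For different environments, docker-compose also supports passing multiple files, e.g. `docker-compose -f docker-compose.yml -f docker-compose.test.yml up`, where later files override earlier ones.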


duluca commented Jun 2, 2019

@iuviene you have to make sure your Docker daemon is configured to start on machine startup through its settings. Personally, I don't turn this on for my laptop, to save on battery life.


orefalo commented Mar 5, 2020

tks for sharing.


duluca commented Mar 13, 2020

@orefalo did you use the mrm task? I'm curious what the experience of using it was like.


z3rg commented Oct 29, 2020

Hi Duluca,

Thanks for sharing. I just hit a wall while forcing HTTPS: I tried the true option and xProto, and it's not working. Could you give an example of a Dockerfile? Since I read it in your book, I think it needs to be updated.


So helpful. Sorry for not seeing this till today!
