Azure DevOps pipeline with multi-stage docker image layer caching support
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  imageRepository: 'multi-stage-build-caching'
  containerRegistry: 'ACR-Service-Connection-Name'
  registryName: '[redacted].azurecr.io'
  tag: '$(Build.BuildNumber)'

steps:
- task: Docker@2
  displayName: "Registry Login"
  inputs:
    containerRegistry: '$(containerRegistry)'
    command: 'login'

- script: "docker pull $(registryName)/$(imageRepository):latest-build && docker pull $(registryName)/$(imageRepository):latest"
  displayName: "Pull latest for layer caching"
  condition: ne(variables['USE_BUILD_CACHE'], 'false')
  continueOnError: true # for the first build, tags will not exist yet

- task: Docker@2
  displayName: "Build - Builder Stage"
  inputs:
    containerRegistry: '$(containerRegistry)'
    repository: '$(imageRepository)'
    command: 'build'
    Dockerfile: './Dockerfile'
    buildContext: '.'
    arguments: '--target builder --cache-from=$(registryName)/$(imageRepository):latest-build'
    tags: |
      $(tag)-build
      latest-build

- task: Docker@2
  displayName: "Build - Final Stage"
  inputs:
    containerRegistry: '$(containerRegistry)'
    repository: '$(imageRepository)'
    command: 'build'
    Dockerfile: './Dockerfile'
    buildContext: '.'
    arguments: '--cache-from=$(registryName)/$(imageRepository):latest-build --cache-from=$(registryName)/$(imageRepository):latest'
    tags: |
      $(tag)
      latest

- task: Docker@2
  displayName: "Push images"
  inputs:
    command: 'push'
    containerRegistry: '$(containerRegistry)'
    repository: '$(imageRepository)'
    tags: |
      $(tag)-build
      latest-build
      $(tag)
      latest
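For reference, here is a minimal sketch of the Dockerfile shape this pipeline assumes. The stage name `builder` must match the `--target builder` argument in the "Build - Builder Stage" task; the base images and build commands below are illustrative assumptions, not taken from the gist:

```dockerfile
# Builder stage - built first and tagged latest-build so its layers
# can be pulled and reused as a cache source on the next run.
FROM node:18 AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Final stage - the "Build - Final Stage" task builds this, reusing
# the builder layers via --cache-from.
FROM nginx:alpine
COPY --from=builder /app/dist /usr/share/nginx/html
```

Note that building the builder stage explicitly and pushing it is what makes the intermediate layers available to `--cache-from` at all; a plain multi-stage build would discard them on an ephemeral agent.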

girip11 commented Jul 1, 2022

I tried a similar multi-stage build, but the build didn't honor the --target flag; it built all the stages, even though I asked for only the production stage.
My Dockerfile looks like:

FROM python:3.8-slim as base
...

FROM base as test
....

FROM base as production

The Azure pipeline task looks like this:

  - task: Docker@2
    displayName: Build and push an image to container registry
    inputs:
      command: build
      repository: $(Build.Repository.Name)
      Dockerfile: $(dockerfilePath)
      arguments: "--target production --cache-from=$(containerRegistry)/$(Build.Repository.Name):latest"
      tags: |
        $(tag)

I referred to this Stack Overflow thread. It does mention that we can't build a particular stage using Docker@2.

Did it work for you?

@TheDruidsKeeper (Author)

@girip11 this solution definitely did work for me, and I believe it's still in use at the company I built it for. The issue you're running into, though, is different from the one I was solving here.

Your sample (like the one in the SO thread) expects the test stage to be skipped because it isn't referenced by the final production stage. As the only answer on SO points out, the classic docker builder doesn't support skipping stages, but with BuildKit it looks to be possible (I haven't tried this myself).

My sample here is a workaround for MS-hosted build agents being ephemeral, which means no layers are cached from one run to the next. By tagging the builder stage and pushing/pulling the latest-build tag, I avoid rebuilding all the layers on every run: docker sees that a layer hasn't changed versus what was already built and reuses it. Properly implemented, you'll see "Using cache" messages in your build log on those steps, and the work won't be redone.


girip11 commented Jul 11, 2022

@TheDruidsKeeper Thanks for the reply. You are right. In my case I had to build test and production separately (ideally I wanted test to use production as its base). Using BuildKit works; I just have to set the environment variable DOCKER_BUILDKIT=1. Recent docker versions support skipping stages, but we were using custom agent pools where the docker version was old. Yes, I did enable docker caching and it works. Once again, thanks a lot for responding.
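In an Azure pipeline, one way to apply this fix is to set the variable at pipeline scope so the Docker@2 build tasks pick it up. This is a minimal sketch of that approach, not taken from the gist:

```yaml
variables:
  # With BuildKit enabled, `docker build --target production` only
  # builds the stages that the production stage actually depends on.
  DOCKER_BUILDKIT: 1
```

Task-level `env:` on a script step that runs `docker build` directly would work as well.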

@TheDruidsKeeper (Author)

Great to hear 👍
