Hints and tips for Azure DevOps Pipeline

Overview

In this article I would like to share some examples of Azure Pipelines features that can be tricky to implement the first time, yet come up again and again in slightly different variants.

Pass matrix parameters into a condition

I like to use the matrix strategy in Azure Pipelines because it shortens your code and generates the jobs for you. The tricky part is passing information from the matrix into specific places, e.g. into a condition in a task definition. I had this matrix definition at the top of my pipeline:

strategy:
  matrix:
    python_package_one:
      python.version: "3.9"
      package.name: "python_package_one"
      packageConfiguration: private
    python_package_two:
      python.version: "3.9"
      package.name: "python_package_two"
      packageConfiguration: public

Then I wanted to pass the information about which version of Python to use:

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "$(python.version)"
      addToPath: true
    displayName: "Use Python $(python.version)"
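
The other matrix values are exposed in exactly the same way. As a small sketch (this step is my own illustration, not part of the original pipeline), $(package.name) can be consumed by any later step:

steps:
  - script: |
      # Both values are resolved at runtime from the matrix entry that generated this job
      echo "Building $(package.name) with Python $(python.version)"
    displayName: "Show matrix values for $(package.name)"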

And then I wanted a specific condition for the private or public packageConfiguration. Conditions work fine with explicitly declared parameters or variables, as in the snippet below, but how do you achieve the same with a matrix strategy?

variables:
- name: testEmpty
  value: ''

parameters:
- name: doThing
  default: true
  type: boolean

steps:
  - script: ...
    condition: and(succeeded(), eq('${{ parameters.doThing }}', 'true'))
  - script: ...
    condition: eq(variables.testEmpty, '')

The answer is simple: the matrix strategy generates not only the jobs for you, but also variables. The values are different for each matrix entry, and they can be used in a condition:

strategy:
  matrix:
    python_package_one:
      python.version: "3.9"
      package.name: "python_package_one"
      packageConfiguration: private
    python_package_two:
      python.version: "3.9"
      package.name: "python_package_two"
      packageConfiguration: public

steps:
  - script: ...
    condition: and(succeeded(), eq(variables.packageConfiguration, 'private'))
  - script: ...
    condition: and(succeeded(), eq(variables.packageConfiguration, 'public'))
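
Putting the pieces together, here is a minimal sketch of a complete job. The echo publish steps are placeholders of my own; the real pipeline would run the actual publish commands there:

jobs:
  - job: build_and_publish
    strategy:
      matrix:
        python_package_one:
          python.version: "3.9"
          package.name: "python_package_one"
          packageConfiguration: private
        python_package_two:
          python.version: "3.9"
          package.name: "python_package_two"
          packageConfiguration: public
    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: "$(python.version)"
          addToPath: true
        displayName: "Use Python $(python.version)"
      # Only one of the two steps below runs per matrix job, depending on packageConfiguration
      - script: echo "Publishing $(package.name) to the internal feed"
        condition: and(succeeded(), eq(variables.packageConfiguration, 'private'))
        displayName: "Publish private package"
      - script: echo "Publishing $(package.name) to the public index"
        condition: and(succeeded(), eq(variables.packageConfiguration, 'public'))
        displayName: "Publish public package"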

Official documentation: jobs-job-strategy

Use stages with parallel runs

The last time I tried to use multiple stages, for some reason it did not work as expected. The expectation was: choosing one package should trigger only one stage, choosing two packages should trigger two, and in any case a stage should not wait for any dependencies but run in parallel. However, it did not work that way.

trigger:
  - master

pool:
  vmImage: "ubuntu-latest"

parameters:
  - name: python_package_one
    type: boolean
    default: false
  - name: python_package_two
    type: boolean
    default: false

stages:
  - stage: python_package_one
    condition: eq('${{ parameters.python_package_one }}', true)
    ...
  - stage: python_package_two
    condition: eq('${{ parameters.python_package_two }}', true)
    ...

Then I found devops-cicd-parallel-stages-deployment and after a small change it worked. Conclusion: if you want to run stages in parallel, remember to specify an empty dependsOn on each stage.

stages:
  - stage: python_package_one
    dependsOn: '' # Without a value. This will let ADO know the stage can execute w/o any dependencies.
    condition: eq('${{ parameters.python_package_one }}', true)
    ...
  - stage: python_package_two
    dependsOn: '' # Without a value. This will let ADO know the stage can execute w/o any dependencies.
    condition: eq('${{ parameters.python_package_two }}', true)
    ...
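
For completeness, here is a sketch of what one of these stages could contain; the build job and its echo step are placeholders I added for illustration, not the original build logic:

stages:
  - stage: python_package_one
    dependsOn: ''  # No dependencies, so the stage can start right away
    condition: eq('${{ parameters.python_package_one }}', true)
    jobs:
      - job: build
        steps:
          - script: echo "Building python_package_one"  # placeholder for the real build steps
            displayName: "Build python_package_one"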

Loops in Azure DevOps

These are the files I used to make loops work with templates in Azure Pipelines. First, loop-template.yaml:

parameters:
  - name: projects
    type: object
    default: []

steps:
  - ${{ each project in parameters.projects }}:
      - bash: |
          echo ${{ project }}
          # Split "repo/name:tag" into its parts, e.g. companyname/image-backend:v3.5.0
          my_arr=($(echo ${{ project }} | tr ":" "\n" | tr "/" "\n"))

          FOO_1=${my_arr[1]}   # image name, e.g. image-backend
          FOO_2=${my_arr[2]}   # image tag, e.g. v3.5.0

          # Expose both parts as pipeline variables for the following template steps
          echo "##vso[task.setvariable variable=image_name]$FOO_1"
          echo "##vso[task.setvariable variable=image_tag]$FOO_2"
        displayName: 'Working ${{ project }}'

      - template: pull-scan-push.yaml
        parameters:
          docker_image_current_tag: "$(image_tag)"
          docker_repository: "$(image_name)"
          docker_base_name: "docker.artifact.com/companyname"
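
The pull-scan-push.yaml template itself is not shown here, but a minimal sketch of what such a template could look like follows. The Trivy scan step and the TARGET_REGISTRY variable are assumptions used only for illustration; the real template may differ.

parameters:
  - name: docker_image_current_tag
    type: string
  - name: docker_repository
    type: string
  - name: docker_base_name
    type: string

steps:
  - bash: |
      # Pull the requested image version from the source registry
      docker pull ${{ parameters.docker_base_name }}/${{ parameters.docker_repository }}:${{ parameters.docker_image_current_tag }}
    displayName: 'Pull ${{ parameters.docker_repository }}'

  - bash: |
      # Scan the image for vulnerabilities (Trivy is only an example scanner here)
      trivy image --exit-code 1 --severity HIGH,CRITICAL \
        ${{ parameters.docker_base_name }}/${{ parameters.docker_repository }}:${{ parameters.docker_image_current_tag }}
    displayName: 'Scan ${{ parameters.docker_repository }}'

  - bash: |
      # Re-tag and push into the target registry; TARGET_REGISTRY is an assumed variable
      docker tag ${{ parameters.docker_base_name }}/${{ parameters.docker_repository }}:${{ parameters.docker_image_current_tag }} \
        $(TARGET_REGISTRY)/${{ parameters.docker_repository }}:${{ parameters.docker_image_current_tag }}
      docker push $(TARGET_REGISTRY)/${{ parameters.docker_repository }}:${{ parameters.docker_image_current_tag }}
    displayName: 'Push ${{ parameters.docker_repository }}'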

The main ci.yaml file below calls the whole loop structure, and the loop expands into the pull/scan/push steps for every image. On top of that it is easy to maintain: the task was, for example, to be able to specify the tag for the Docker images (latest was too close to development, production was too far for testing), so when a specific version number is wanted there is one small file in which all the numbers can be changed in one place.

The main idea behind this task was to pull a specific version of the Docker images, scan them for vulnerabilities, and, if everything is fine, push them into the next Artifactory instance for further processing (mostly related to the prod deployment).

trigger:
  paths:
    include:
      - "/pipelines"

variables:
  - name: system.debug
    value: true
  - group: docker-login    # Login credentials information

stages:
  - stage: pull_and_scan_and_push  # Pull docker images, scan them and push into another artifactory
    pool:
      vmImage: "ubuntu-latest"
    jobs:
      - job:
        steps:
          - template: templates/loop-template.yaml
            parameters:
              projects:
                - companyname/image-backend:v3.5.0
                - companyname/image-dashboard:v3.5.0
                - companyname/image-vm:v3.5.0
                - companyname/image-dashboard-client:v3.5.0
                - companyname/image-task-manager:v3.5.0
                - companyname/image-email-service:v3.5.0
                - companyname/image-statistics:v3.5.0
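
Since all images in the list above share the same version, the tag can also be lifted into a single pipeline parameter so there is literally only one value to change. A small sketch; the parameter name is my own assumption, not part of the original pipeline:

parameters:
  - name: image_version
    type: string
    default: "v3.5.0"

stages:
  - stage: pull_and_scan_and_push
    pool:
      vmImage: "ubuntu-latest"
    jobs:
      - job:
        steps:
          - template: templates/loop-template.yaml
            parameters:
              projects:
                # The parameter is expanded at compile time into each image reference
                - companyname/image-backend:${{ parameters.image_version }}
                - companyname/image-dashboard:${{ parameters.image_version }}
                - companyname/image-vm:${{ parameters.image_version }}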

I hope this information is useful for your projects and use cases.

Michal Slovík
Java developer and Cloud DevOps

My job interests include DevOps, Java development, and Docker / Kubernetes technologies.