
Building an Azure DevOps Pipeline for Django


In this post we are going to show how to create a CI pipeline for a Python/Django-based web application using Azure DevOps. We will present the tasks the pipeline executes, how it was set up, and how components such as Variable Groups and Azure Key Vault are used to pass variables and secrets to the pipeline.

What is Azure DevOps?

Azure DevOps is one of Microsoft’s software collaboration suites. It’s similar to other popular suites such as GitHub, GitLab, and Bitbucket in that it provides organization and project support, code repositories, issue trackers, and continuous integration and continuous delivery (CI/CD) tools.

The documentation is up to date and, in the majority of cases, very polished. Resources are created and operate quickly, and the overall experience is very smooth.

Azure Pipelines is the part of Azure DevOps that handles CI/CD. It enables teams and developers to continuously build, test, and deploy their applications, in Azure but also on any other platform or cloud environment.

What is the purpose of our CI Pipeline?

The CI pipeline can be thought of as the workflow that defines what is built and tested automatically on each commit or pull request. In short, it builds the containers, runs all automated checks (unit tests, linters, formatters, security checks) and, in specific cases (a merge to the main branch), pushes the containers to our registry.

We are working with a Django application connected to a PostgreSQL database, running a number of workers (Celery and Celery Beat) plus a number of services that include Redis. It is deployed in production and runs via a Docker Compose file.

Development and staging environments are also powered through respective Docker Compose files, and there’s an attempt to keep changes between these files minimal so that what gets deployed in production is very close to what is deployed on a development environment.

We want the CI pipeline to perform the following actions:

  1. Build all Docker images the application is using
    This step is a prerequisite and will either confirm that all images can be built, or fail. For example, it will fail if a Python library version update conflicts with what another library requires. Since we are using Poetry, this is unlikely to happen.
  2. Start the application
  3. Run checks on all commits
    This includes unit tests on the Django application, linters and formatters (flake8 and black), security checkers, and test coverage reporting.
  4. Tag and push produced Docker images
    If any of the steps fails, the pipeline exits. The last step is conditional and runs only when there is a merge to the main branch.

The pipeline is powered through a single YAML file, azure-pipelines.yml.

Stages versus Jobs

The pipeline is simple enough to only require jobs and thus we won’t create any stage. Stages would make sense if we wanted to set a logical boundary (e.g., BUILD vs DEPLOY stages), but this is not needed for now.

For serial jobs, we specify the dependsOn keyword, which tells Azure Pipelines not to start a job before its dependencies have completed. Moreover, if we want to ensure that jobs run on the same Azure agent, we have to pin them to it by specifying the agent name.

By default, jobs can run in parallel and may be picked up by different agents. While this can be fine for some applications, it does not work when a job expects to use results from a previous job (e.g., files located on the filesystem). The documentation’s detailed instructions on how jobs run give good hints on how to design a pipeline.

So now jobs are defined in azure-pipelines.yml like this:

jobs:

- job: pre_init_steps
  displayName: Pre initialization steps
  steps:
  - script:
     ...

- job: build_test_steps
  displayName: Build, run tests and linters
  dependsOn: pre_init_steps
  steps:
  - task: AzureCLI@2
    displayName: Run first task
    ...
  - task: AzureCLI@2
    displayName: Run more tasks
    ...

- job: post_build_steps
  displayName: Tag and push to Registry steps
  dependsOn: build_test_steps
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  ...

We can see how specifying that build_test_steps depends on pre_init_steps and post_build_steps depends on build_test_steps will set the pipeline jobs in the following series:

pre_init_steps → build_test_steps → post_build_steps

Also, the condition keyword on the last job ensures that it only runs for commits to the main branch. Since the main branch is protected from direct commits, this in practice means it runs only as the result of a pull request being merged.
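Conditions can also test other predefined variables. As an illustrative sketch (not taken from our pipeline), the built-in Build.Reason variable could restrict a job to pull request validation builds:

condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))

Build.Reason is a predefined Azure Pipelines variable, so no extra setup is required to read it.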

(Screenshot: azure4.png)

Azure Pipelines setup

At this step we must have a DevOps subscription and a project with a code repository. Take note of the subscription identifier, as it will need to be passed to the pipeline file as the azureSubscription variable. Now we are ready to create the following resources:

  • a Resource Group in Azure, which is a container that holds related resources for an Azure solution (and keeps things well organized);
  • a Container Registry, since we will push the created containers to it. We take note of the login credentials, since we want to pass them to the pipeline as well (to be able to push images);
  • a Key Vault, which can optionally be used to store secrets and pass them to the pipeline;
  • a Service Connection, which allows the pipeline to connect to resources, including the Microsoft Azure subscription; and
  • one or more Variable Groups inside the Pipelines Library section. Here we enter all values that need to be passed to our application (e.g., timezone, email settings, or any Django or application-related value that needs to be present as an environment variable). Any Variable Group that we create can be referenced in the pipeline YAML file:
variables:
- group: EnvironmentVariables
- group: EnvironmentVariables-Secrets

All name/value pairs that are enabled for a pipeline become available to the application as operating system environment variables (secret variables are the exception: they have to be mapped explicitly with the env keyword on a step). We suggest creating one group with application-specific variables that contain no secrets, and another one with the secrets.
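As a small illustration (the variable names TIMEZONE and DATABASE_PASSWORD are hypothetical), a non-secret variable from a linked group can be read directly, while a secret has to be mapped via env:

steps:
- script: |
    echo "Timezone: $(TIMEZONE)"          # non-secret group variable, available directly
    echo "Secret is set: ${DB_PASS:+yes}" # secret, only visible via the env mapping below
  env:
    DB_PASS: $(DATABASE_PASSWORD)         # DATABASE_PASSWORD is a hypothetical secret variable
  displayName: Use group variables

Keeping secrets out of inline $( ) expansions and mapping them through env also reduces the chance of them leaking into logs.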

Variable groups can also get their key/value pairs from a specified Key Vault, via the option “Link secrets from an Azure key vault as variables.”
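Alternatively, secrets can be fetched during a run with the built-in AzureKeyVault task; a hedged sketch, assuming a vault named my-key-vault:

- task: AzureKeyVault@2
  displayName: Fetch secrets from Key Vault
  inputs:
    azureSubscription: $(azureSubscription)
    KeyVaultName: 'my-key-vault'   # hypothetical vault name
    SecretsFilter: '*'             # or a comma-separated list of secret names
    RunAsPreJob: true              # make secrets available before other job steps run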

Through the Security tab in Variable Groups, we can limit who can see these variables among the accounts that have access to the project/repository. We might want the pipeline to read them without some accounts/groups being able to access them.

After a Variable Group is created, we have to enter the Pipeline Permissions tab and grant the pipeline access to use these variables.

(Screenshot: azure3.png)

Create Azure Pipeline

Having created the resources above, and with the file azure-pipelines.yml merged into the main branch, we can go to the Pipelines section and hit New Pipeline. We select Azure Repos Git for the question “Where is your code?”, then select the repository from the list of repositories, then select Existing Azure Pipelines YAML file from the options provided in the next section. On the dropdown that appears, we specify the main branch and the azure-pipelines.yml file and hit Continue.

On the next page we can review the pipeline and hit Run. The pipeline will start running and, if the service connection was not authorized beforehand, it will notify us that it needs permission to access resources before it can continue.

Once we view and allow permissions, the pipeline now runs!

(Screenshot: azure2.png)

Azure Pipeline topics

Here are a few other things to keep in mind as you set all this up.

When the pipeline gets triggered to run
By default, the pipeline runs after every commit, but depending on our scenario we might choose to limit the triggering events, e.g. run only on commits to specific branch names or for specific event types.

trigger:
  branches:
    include:
    - '*'
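Should we later want to narrow this down, the include list accepts specific branch names and wildcards, and an exclude list can be added as well; the branch names below are illustrative:

trigger:
  branches:
    include:
    - main
    - releases/*
    exclude:
    - experimental/*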

Azure agent hosts vs. self-hosted agents

By default, pipelines run on Azure’s infrastructure, and if we don’t specify the vmImage parameter they run on ubuntu-latest. Since this is the default behavior, the following can be omitted from the pipeline:

pool:
  # Agent VM image name
  vmImage: 'ubuntu-latest'

In most cases that is fine, but there are cases where we want to set up our own self-hosted agents:

  • We want the pipeline to run on a specific OS that is not among the provided Azure agents
  • The pipeline runs tests that call internal services which are not reachable from Microsoft-hosted agents, but are reachable from self-hosted agents that have access to these services.

Of course, by using self-hosted agents we lose the convenient, automatically provisioned environments that Azure provides for running our jobs. Microsoft-hosted agents give each job a fresh environment, so if we are using a limited number of self-hosted agents, special care might have to be taken to clean up the environment after each run so that it does not create problems for future runs.
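Switching to self-hosted agents means selecting a pool by name rather than a Microsoft-hosted VM image; a sketch assuming a pool called MyPool and an agent named build-agent-01 (both hypothetical):

pool:
  name: 'MyPool'                        # self-hosted agent pool (hypothetical name)
  demands:
  - Agent.Name -equals build-agent-01   # optionally pin jobs to one specific agent

The demands entry is what lets serial jobs that share filesystem state land on the same agent.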

Set variables early in the pipeline
It is worth defining a number of variables up front if they are going to be used many times throughout the pipeline:

variables:
  - name: tag
    value: '$(Build.SourceVersion)'
  - name: azureSubscription
    value: 'subscription_id'
  - name: docker_images_prefix
    value: 'organization_registry.azurecr.io/prefix'

Run job without checking out the repository

By default, a job checks out the repository first when it starts. In case we don’t want this to happen, we can use the checkout: none option, as example_b does here:

- job: example_a
  displayName: Code is checked out when this job starts
  steps:
  - script:
    do_something

- job: example_b
  displayName: Code is not checked out when this job starts
  steps:
  - checkout: none # this is run without having checked out the repository
  - script:
    do_something

Complete Pipeline: azure-pipelines.yml

Finally, the complete pipeline for our project is now:

azure-pipelines.yml

trigger:
  branches:
    include:
    - '*'

variables:
  - group: EnvironmentVariables
  - group: EnvironmentVariables-Secrets
  - name: tag
    value: '$(Build.SourceVersion)'
  - name: azureSubscription
    value: 'subscription_id'
  - name: docker_images_prefix
    value: 'custom.azurecr.io/prefix'

jobs:
- job: pre_init_steps
  displayName: Pre initialization steps
  steps:
  - checkout: none # this is run without having checked out the repository
  - script:
      echo "do nothing for now"
    displayName: 'Environment pre initialization steps'

- job: build_test_steps
  displayName: Build, run tests and linters
  dependsOn: pre_init_steps
  steps:
  - task: AzureCLI@2
    displayName: Build container images and start docker compose
    inputs:
      azureSubscription: $(azureSubscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        docker-compose -f docker-compose-development.yml build
        docker-compose -f docker-compose-development.yml up -d

  - task: AzureCLI@2
    displayName: Run formatters, linters, checkers
    inputs:
      azureSubscription: $(azureSubscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        docker-compose -f docker-compose-development.yml exec -T backend black . --check
        docker-compose -f docker-compose-development.yml exec -T backend python manage.py makemigrations --check --dry-run
        docker-compose -f docker-compose-development.yml exec -T backend pytest --cov --cov-report=html --cov-config=.coveragerc
        docker-compose -f docker-compose-development.yml exec -T backend flake8 .

- job: post_build_steps
  displayName: Tag and push to Registry steps
  dependsOn: build_test_steps
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - checkout: none
  - task: AzureCLI@2
    displayName: Tag and push to Registry
    inputs:
      azureSubscription: $(azureSubscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        echo "$(REGISTRY_PASSWORD)" | docker login $(REGISTRY_URL) --username $(REGISTRY_USER) --password-stdin
        docker tag "$(docker_images_prefix)_web" "$(docker_images_prefix)_web":latest
        docker tag "$(docker_images_prefix)_web" "$(docker_images_prefix)_web":$(tag)
        docker image push --all-tags "$(docker_images_prefix)_web"

(Screenshot: azure.png)

Recap

We’ve created a simple yet robust pipeline on Azure DevOps Pipelines that can be easily extended at any time if new services are added or different sets of tests are requested. The pipeline is a big step toward ensuring code consistency and enforcing existing good practices without introducing issues.
