Part 1: Go, Docker and self-hosted build agents
What is Azure DevOps?
Azure DevOps is a web platform from Microsoft that provides tools for various areas surrounding IT projects:
- Azure Boards for project management
- Azure Pipelines for CI/CD
- Azure Repos for source code management
- Azure Test Plans for manual testing
- Azure Artifacts for artifact management
The tools work hand-in-hand: for example, work items from Azure Boards can be linked to pull requests in Azure Repos. Before a pull request can be merged, a pipeline in Azure Pipelines must confirm the correctness of the code and finally upload an artifact to Azure Artifacts.
In this blog post series, we will only use Azure Repos and Azure Pipelines.
The first CI pipeline: Go and Docker
Our first use case is a microservice written in Go that is to be deployed using Docker. We will create a CI pipeline that will:
- Build and test the microservice
- Build a Docker Image
- Upload the Docker image to a Docker Registry
Now for the exciting part: the CI pipeline. In Azure DevOps, the executable part of a pipeline consists of stages, a stage consists of jobs, and a job consists of steps. For our simple case, a single stage with one job is quite sufficient. The functionality of a step is described by a task; for example, there is a Go task, a Docker task and a Git checkout task. The Bash and PowerShell tasks even give us the possibility to execute custom scripts, and for more complex cases there is also the possibility to develop your own tasks in TypeScript. Since the build process is already completely defined in the Dockerfile, we only need docker build and docker push as build steps. For this, we use the Docker task.
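The Dockerfile could, for example, be a multi-stage build; the Go version and the binary name below are assumptions:

```dockerfile
# Build stage: compile a static Go binary
FROM golang:1.21 AS build
WORKDIR /app
COPY go.mod ./
RUN go mod download
COPY . .
RUN CGO_ENABLED=0 go build -o /microservice .

# Runtime stage: minimal image containing only the binary
FROM alpine:3.19
COPY --from=build /microservice /microservice
ENTRYPOINT ["/microservice"]
```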
Besides the actual build process, we can define in the pipeline:
- for which events the pipeline should be triggered automatically
- which variables and variable groups are to be used
- whether the pipeline should be parameterized
- whether additional git repositories should be checked out
In the pipeline we use the following predefined variables:
- $(Build.SourcesDirectory), the path on the build agent where the git repo is checked out, as a path prefix for accessing files.
- $(Build.Repository.Name), the name of the git repo, as the Docker image name.
- $(Build.SourceVersion), the commit hash as the Docker image tag.
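Putting this together, the core of our azure-pipelines.yaml can be sketched as follows; the trigger branch and the agent pool are assumptions, and 'docker-hub' is the name of the service connection described next:

```yaml
trigger:
  - main  # assumption: run on every push to main

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Build the image and push it to the registry behind the
  # 'docker-hub' service connection
  - task: Docker@2
    inputs:
      containerRegistry: 'docker-hub'
      repository: '$(Build.Repository.Name)'
      command: 'buildAndPush'
      Dockerfile: '$(Build.SourcesDirectory)/Dockerfile'
      tags: '$(Build.SourceVersion)'
```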
We have specified ‘docker-hub’ as the target container registry here, which is a reference to a so-called service connection. A service connection describes a connection to an external service; the pipeline can use it without any credentials being stored in the pipeline itself. To create a new service connection, we go to the “Project Settings” page and then to “Pipelines” → “Service Connections”. Here we create a connection to a Docker registry on Docker Hub. The prerequisite for this is a (free) Docker account; of course, any other Docker registry could also be used.
Next, we need to enter our credentials for the Docker Hub account. Important: you have to create an access token on Docker Hub beforehand.
After committing all the files in git, all we need to do is create the pipeline in the Azure DevOps interface, pointing it to our azure-pipelines.yaml file. To do this, we go to “Pipelines” and then click on “Create Pipeline”:
Our azure-pipelines.yaml file resides in Azure Repos:
After selecting our git repo, Azure DevOps automatically recognizes our azure-pipelines.yaml because it is the only YAML file in that repo.
A well-considered click on “Run” and we can finally see the fruits of our labor:
As we can see, the Docker image was built and automatically uploaded to Docker Hub:
A quick test in a local shell confirms that everything worked fine and the Docker image can now be pulled from anywhere:
Unit Tests and Code Coverage
Automated testing is part of every good pipeline. That’s why we will now add a unit test to our Go project and a step in the pipeline that executes the test. If a test fails, the pipeline should terminate, and the Docker image should neither be built nor pushed. Our test sends an HTTP request to the standalone server and checks the response:
Locally, the test already works:
Next, we add the test execution to the pipeline. Azure DevOps provides two predefined tasks for Go projects: GoTool and Go. With GoTool we select the Go version for the pipeline, with Go we can run Go commands. Before running the tests, we build our Go project. Although this wouldn’t be strictly necessary, it helps in debugging whether an error occurs during the build (syntax error) or only when running the tests (semantic error). The build itself requires two steps: go mod download to download the libraries and go build to compile. The tests are then executed using go test. Now we are ready to extend the pipeline as follows:
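The new steps can be sketched like this (the Go version is an assumption); they run before the Docker task, so a failing test aborts the pipeline before the image is built:

```yaml
steps:
  # Select the Go version for the pipeline
  - task: GoTool@0
    inputs:
      version: '1.21'

  # Download the libraries
  - task: Go@0
    inputs:
      command: 'custom'
      customCommand: 'mod'
      arguments: 'download'

  # Compile first, so build errors are distinguishable from test failures
  - task: Go@0
    inputs:
      command: 'build'

  # Run the unit tests; a non-zero exit code fails the step
  - task: Go@0
    inputs:
      command: 'test'
      arguments: './...'
```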
After committing and pushing in git, the pipeline should automatically start, build the application, and execute the test:
We’re beginning to get a feel for CI and how to implement it in Azure DevOps. To make sure that the negative case also works, we now change the code so that the test fails:
As expected, the pipeline fails and aborts before the Docker image is built:
However, to find out which test failed and why, we need to look in the logs. For a single test this is not a problem, but with hundreds of tests we don’t have the time to scroll through thousands of log lines to find the one that failed, and we also don’t immediately see what percentage of the tests failed. Fortunately, Azure DevOps offers an interface for publishing test results in JUnit XML format. To make use of this feature, we need to convert the output of go test into this format. Luckily, someone has already done this work for us and written a corresponding Go tool: https://github.com/jstemmer/go-junit-report. We are also interested in test coverage; here, too, Azure DevOps offers an interface, and there are ready-made tools for converting the coverage output into the right format.
For this whole process, we create a Bash task that does the following: first it downloads the necessary tools, then it runs the tests, remembering the exit code for later. We want to use the exit code of go test as the exit code of the whole step, so that Azure DevOps knows whether the step failed. Before exiting, however, we prepare the report and the coverage, both in case of success and failure. Afterwards, we add the two tasks PublishTestResults and PublishCodeCoverageResults to the pipeline. Here it is important to add condition: succeededOrFailed(). Normally, subsequent steps are not executed if a step fails (i.e. the default value is condition: succeeded()), but with condition: succeededOrFailed() they are executed even if previous steps failed — yet, unlike condition: always(), not if the pipeline was manually canceled.
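Sketched as pipeline steps, this could look as follows; gocov and gocov-xml as the coverage converters are one common choice (an assumption), and the file names are made up:

```yaml
steps:
  - task: Bash@3
    displayName: 'Run tests with report and coverage'
    inputs:
      targetType: 'inline'
      script: |
        # Install the conversion tools and put them on the PATH
        go install github.com/jstemmer/go-junit-report@latest
        go install github.com/axw/gocov/gocov@latest
        go install github.com/AlekSi/gocov-xml@latest
        export PATH="$PATH:$(go env GOPATH)/bin"
        # Run the tests, remembering the exit code for later
        go test -v -coverprofile=coverage.txt ./... > test-output.txt 2>&1
        TEST_EXIT_CODE=$?
        # Prepare report and coverage, in case of success and failure
        go-junit-report < test-output.txt > junit.xml
        gocov convert coverage.txt | gocov-xml > coverage.xml
        # Use the go test exit code as the exit code of the step
        exit $TEST_EXIT_CODE

  - task: PublishTestResults@2
    condition: succeededOrFailed()
    inputs:
      testResultsFormat: 'JUnit'
      testResultsFiles: 'junit.xml'

  - task: PublishCodeCoverageResults@1
    condition: succeededOrFailed()
    inputs:
      codeCoverageTool: 'Cobertura'
      summaryFileLocation: 'coverage.xml'
```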
A side note in case the builds are run on a self-hosted build agent: the PublishCodeCoverageResults task requires a .NET runtime to be installed on the agent.
Here is the finished pipeline:
After a successful run of the pipeline, we now see the test results and coverage graphically displayed in two new tabs:
Similarly, in case of a failed test, we see an accurate message about the cause:
Conclusion
Azure DevOps offers a good way to create pipelines quickly and conveniently. The graphical user interface is simple, easy to understand, and a good starting point for beginners. Connecting to external services is also quick and simple. At the same time, the switch to YAML syntax for pipelines makes Azure DevOps suitable for more complex use cases, allowing you to define pipelines of almost unlimited complexity. In this blog post, we have only looked at a fraction of the features of Azure DevOps. In part 2 we will
- create another Go project with its own pipeline and use it as a dependency in our first Go project
- create a pipeline template to reuse for multiple pipelines
- add an intelligent versioning algorithm according to Semantic Versioning to the pipeline.