Breaking the Cycle of Fragility: Setting Up Unit Tests and Code Coverage in Azure DevOps…Yet Again
Unit tests and code coverage reports are essential components of a robust CI/CD pipeline. Azure DevOps (AzDO) provides an intuitive interface for visualizing test results and code coverage for every pipeline execution. This not only enhances build validation pipelines but also empowers code reviewers with crucial data points, boosting their confidence in the quality of the codebase they are reviewing.
However, setting this up can be tricky, requiring precise configuration to get everything working seamlessly. For some reason this setup always feels brittle to me. I find myself solving the same problem over and over again, and it is not easily fixed…from memory alone…at least, not my memory.
In this article, we’ll walk through the process of configuring unit test execution and code coverage reporting in AzDO pipelines, complete with detailed YAML snippets, explanations, and screenshots to guide you through each step.
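Before diving into the individual tasks, it helps to see where they live. All of the snippets below go in the steps section of a single pipeline; a minimal skeleton might look like the following, where the trigger branch, the ubuntu-latest image, and the buildConfiguration value are assumptions you should adapt to your own repository:
trigger:
  - main # assumed default branch

pool:
  vmImage: 'ubuntu-latest' # hosted Linux agent; matches the /home/vsts paths shown later

variables:
  buildConfiguration: 'Release' # referenced by the build and test tasks below

steps:
  # ...the tasks described in the sections that follow go here...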
Setting Up the Pipeline for .NET 8.x
The first step is to configure the pipeline to use the latest .NET 8 SDK. This ensures that the tools required for building, testing, and running your code are available during the pipeline execution.
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '8.x' # Use the latest .NET 8 SDK
    installationPath: $(Agent.ToolsDirectory)/dotnet
Here, the UseDotNet@2 task is used to install the .NET SDK. The packageType specifies the type of package to install (in this case, the SDK), and the version ensures compatibility with your project. The installationPath specifies where the SDK will be installed on the agent machine.
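If your repository already pins its SDK version with a global.json file, the same task can read the version from there instead of hard-coding it. A minimal sketch, assuming a global.json exists at the repository root:
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    useGlobalJson: true # resolve the SDK version from global.json instead of the version input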
Restoring Dependencies
Next, we restore all project dependencies. This is a critical step to ensure that all required NuGet packages are available before building the solution.
- task: DotNetCoreCLI@2
  displayName: Restore dependencies
  inputs:
    command: 'restore'
    projects: |
      src/dotnet/Foo/Foo.csproj
      src/dotnet/Foo.Tests/Foo.Tests.csproj
The DotNetCoreCLI@2 task runs the restore command, which downloads the dependencies specified in the .csproj files for the main project and its corresponding test project. This step ensures that the build and test tasks will have access to all required libraries.
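If your solution contains more projects than the two shown here, a glob pattern (or the path to your .sln file) keeps the restore step from falling out of date as projects are added. A sketch, assuming everything lives under src/dotnet:
- task: DotNetCoreCLI@2
  displayName: Restore dependencies
  inputs:
    command: 'restore'
    projects: 'src/dotnet/**/*.csproj' # restore every project under src/dotnet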
Building the Solution or Project
After restoring dependencies, the next step is to build the project. This validates that the code compiles correctly before running any tests.
- task: DotNetCoreCLI@2
  displayName: Build solution
  inputs:
    command: 'build'
    projects: |
      src/dotnet/Foo/Foo.csproj
      src/dotnet/Foo.Tests/Foo.Tests.csproj
    arguments: '--configuration $(buildConfiguration)'
Here, the build command compiles the main project and test project in the specified configuration (Debug or Release). This ensures that all artifacts are ready for the test phase.
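Since the previous task already restored the NuGet packages, you can optionally skip the implicit restore that dotnet build performs. Only the arguments input changes; a sketch:
    arguments: '--configuration $(buildConfiguration) --no-restore' # packages were restored in the earlier task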
Running Unit Tests and Collecting Code Coverage
This is where the pipeline begins to showcase its power by running unit tests and collecting code coverage data.
- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'
    arguments: '--configuration $(buildConfiguration) --collect:"XPlat Code Coverage" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=cobertura'
    workingDirectory: $(Build.SourcesDirectory)
In this task, the test command executes all test projects matching the **/*Tests/*.csproj pattern. The --collect:"XPlat Code Coverage" argument enables code coverage collection, and the -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=cobertura override ensures that the output is in the Cobertura format. This generates a coverage.cobertura.xml file, which is essential for displaying code coverage in AzDO.
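If you would rather not pass the collector configuration inline, the same override can live in a runsettings file referenced with --settings. A sketch, assuming a hypothetical coverlet.runsettings file at the repository root that sets the Cobertura format for the XPlat Code Coverage collector (the file itself is not shown here):
- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'
    arguments: '--configuration $(buildConfiguration) --collect:"XPlat Code Coverage" --settings coverlet.runsettings'
    workingDirectory: $(Build.SourcesDirectory)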
Pay attention to one important line in the task output: the path where the coverage.cobertura.xml file is placed. The next step is going to look for this file, or rather for files with this name within any nested sub-folder of the agent's temp directory.
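On a hosted Linux agent the file typically lands in a GUID-named sub-folder of the temp directory, something like the path below (the GUID changes on every run):
/home/vsts/work/_temp/<some-guid>/coverage.cobertura.xml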
Publishing the Code Coverage Results
Once the coverage.cobertura.xml file is generated, the next step is to publish the results so they can be visualized in the AzDO pipeline UI.
- task: PublishCodeCoverageResults@2
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Agent.TempDirectory)/**/coverage.cobertura.xml'
The PublishCodeCoverageResults@2 task takes the generated coverage.cobertura.xml file and integrates it into the pipeline UI. The $(Agent.TempDirectory) predefined variable resolves to /home/vsts/work/_temp on the hosted Linux agent, and the ** wildcard ensures the report is found regardless of which nested sub-folder the collector created. This step enables you to see the code coverage details in a dedicated section of the pipeline run.
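One optional tweak worth considering: with the default condition, this step is skipped whenever the test task fails, which is exactly when you may want to inspect coverage. Adding a condition keeps the publish step running either way; a sketch trimmed to the essentials:
- task: PublishCodeCoverageResults@2
  displayName: 'Publish code coverage'
  condition: succeededOrFailed() # publish coverage even if some tests failed
  inputs:
    summaryFileLocation: '$(Agent.TempDirectory)/**/coverage.cobertura.xml'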
What Success Looks Like
When everything is set up correctly, you’ll see two key reports in your pipeline execution.
Unit Test Outcome Report
A detailed view of which tests passed or failed, including metrics such as the total number of tests run and the duration.
Code Coverage Output Report
A visual representation of code coverage, highlighting the percentage of code covered by the tests and identifying untested areas.
Conclusion
Setting up unit test execution and code coverage integration in Azure DevOps pipelines might seem tedious, but the effort is well worth it. The rich UI and detailed reports provide invaluable insights for build validation pipelines, empowering developers and reviewers with the data they need to maintain high-quality codebases.
By following these steps and carefully configuring your YAML, you can ensure that your pipeline delivers actionable test results and coverage metrics every time. With this setup, you’ll never have to wrestle with brittle configurations or rely solely on memory again.