Software Testing Quality Assurance using xUnit, TestCafe along with Docker and Jenkins: an automated approach 2/3

This is the second of three posts about QA testing using xUnit, TestCafe and Docker, and now we are adding Jenkins to automate the whole approach. If you missed the previous post, it is worth reading first, since most of the steps we will do here build on prerequisites we covered there.

What is Jenkins, what is it used for, and why use it?

Jenkins logo

Jenkins is an open-source automation server, written in Java, with a rich plugin ecosystem designed for Continuous Integration and Continuous Delivery (CI/CD). Jenkins is used to continuously build and test software projects, making it easier for developers to integrate changes into the project and for users to obtain a fresh build. It also enables you to continuously deliver your applications by integrating with a wide range of testing and deployment technologies.

Continuous Integration / Continuous Delivery

Continuous Integration / Continuous Delivery is a software development practice in which developers commit changes to the source code in a shared repository several times a day or more frequently. Each commit made to the repository is then built, which helps teams identify issues early. Depending on the CI/CD tool, it can also take on many other roles, such as deploying the built application to a test server and delivering the build and test results to the affected teams.

Besides all of that technical definition, Jenkins is this distinct gentleman in the image that came to serve you and make your life as a developer better. Let’s not take it for granted and make the most of Jenkins.

jenkins plugins

We all know that CI/CD is one of the most important parts of DevOps, used to integrate its various stages. For those who have no clue what DevOps means, it is a set of practices integrating software development and IT operations. It aims to shorten the system development life cycle and deliver high software quality through continuous integration and delivery.

With Jenkins, not only individual developers but also companies can accelerate software development through automation. Jenkins covers all sorts of development life-cycle processes, including documenting, building, testing, packaging, staging, deployment, static analysis, and even more.

Jenkins’ benefits include:

  • It is an open-source application with great community support.
  • It is straightforward to install using the official Jenkins Docker image.
  • To simplify your job, it has 1000+ plugins. If a plugin doesn’t exist, you can code it and share it with the community.
  • It is cost-free.
  • It is built in Java and is, therefore, portable to all major platforms.

CI/CD with Jenkins depicted

Imagine a situation where the application’s complete source code is built and then deployed for testing on the test server only at the end of development. It sounds like the ideal way to build software, but this process has several pitfalls. I’m going to try to clarify them:

  • Developers have to wait until the complete software is developed to see the test results.
  • There is a strong probability that the test results will reveal several bugs. Locating them is difficult for developers since they have to review the entire source code.
  • It slows down the application delivery process.
  • There is a lack of continuous feedback on coding or architectural problems, build failures, test status and release uploads, so software quality declines.
  • The whole process is manual, raising the likelihood of frequent failures.

From the above-mentioned issues, it is evident that the software delivery process is sluggish and software quality is likely to deteriorate over time, which leads to disappointed customers. So there is a desperate need for a system where a build and test run is triggered for every change made to the source code. This is what CI/CD is all about. Jenkins is one of the most mature CI/CD platforms available, so let us see how Jenkins overcomes these deficiencies in the simplified, self-explanatory workflow diagram below.

simplified CI diagram with Jenkins

As you can see, Jenkins uses the pipeline concept to chain multiple steps, according to parameters you set, to perform both basic and complex tasks. Once defined, pipelines can build code and coordinate the work needed to push applications from commit to delivery.
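To make the pipeline concept concrete, here is a minimal sketch of a declarative Jenkinsfile; the stage names and echo commands are placeholders, not part of the eShopOnWeb setup:

```groovy
// Minimal declarative pipeline sketch -- stage names and shell
// commands are placeholders for illustration only.
pipeline {
    agent any            // run on any available Jenkins node
    stages {
        stage('Build') {
            steps {
                sh 'echo building...'
            }
        }
        stage('Test') {
            steps {
                sh 'echo testing...'
            }
        }
    }
}
```

Each stage shows up as a column in the Jenkins UI, so you can see at a glance where a run failed.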

Let’s make things happen now!

A useful hint I am going to give you is to use docker-compose as much as you can. Docker-compose is a tool for defining and running multi-container Docker applications. With docker-compose, you use a YAML file to configure your application’s services; then you create and start all the services from your configuration with a single command. Besides that, the YAML file is easy to read, and you can treat it as source code and track all changes with your preferred SCM tool. Docker-compose is installed as part of Docker Desktop. If you haven’t installed Docker Desktop yet, go to our first post of a series of four and follow the instructions.


Since I am using Windows, go to PowerShell or Windows Terminal and create the jenkins directory by running the New-Item -Type Directory -Path jenkins command, as shown below.

Jenkins folder

From inside the jenkins directory, type code . and hit Enter to open an instance of VSCode.

open vscode from inside jenkins folder

Create a file called docker-compose.yaml inside the jenkins folder.

docker-compose yaml file

Copy the code below into the docker-compose.yaml file and save it.

version: '3.8'

services:
  cicd:
    image: sitk/jenkins:lts
    container_name: sitk_jenkins
    ports:
      - 8080:8080
      - 50000:50000
    volumes:
      - jenkins-home-volume:/var/jenkins_home

volumes:
  jenkins-home-volume:
    driver: local
    driver_opts:
      type: none
      device: c:\sitk\jenkins\jenkins_home
      o: bind

It should look like this.

docker-compose yaml file filled up

YAML is a human-readable data serialization format that can be used with any programming language and is commonly used to write configuration files. It is imperative to keep this file valid, which means we need to follow some rules, indentation in particular. Therefore, I recommend using the popular Red Hat YAML VSCode extension, with more than 4.6 million installs, to help us validate the file.
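To see why indentation matters, compare these two hypothetical fragments: in the first, ports is nested under the service and belongs to it; in the second, ports drifts to the top level and is no longer part of the service definition, so docker-compose rejects the file:

```yaml
# Valid: "ports" is indented under the service, so it belongs to it.
services:
  cicd:
    ports:
      - 8080:8080

# Invalid for compose: flushed left, "ports" becomes an unexpected
# top-level key instead of a property of the cicd service.
# services:
#   cicd:
# ports:
#   - 8080:8080
```

The extension flags exactly this kind of mistake as you type.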

YAML Red Hat VSCode extension

We will use the official Jenkins Docker image jenkins/jenkins:lts and extend it by installing the .NET Core SDK, since we will restore, clean, build and unit test the eShopOnWeb solution.

Docker Jenkins

Building a new Jenkins Docker image

Let’s build the sitk/jenkins:lts image. Create an image folder inside the jenkins folder, and create a file named Dockerfile inside the image folder with the content shown below.

FROM jenkins/jenkins:lts

# Switch to the root user to install the .NET Core SDK.
USER root

# Show the distro information during the build!
RUN uname -a && cat /etc/*release

# Based on the Microsoft instructions for installing the .NET SDK with APT.
# Before you install the .NET SDK, run the following commands to add the
# Microsoft package signing key to your list of trusted keys and add the
# package repository. Note: the debian/10 path below assumes the base image
# is Debian 10; adjust it to the distro reported by the RUN command above.
RUN wget -O - https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg && \
    mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/ && \
    wget https://packages.microsoft.com/config/debian/10/prod.list && \
    mv prod.list /etc/apt/sources.list.d/microsoft-prod.list && \
    chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg && \
    chown root:root /etc/apt/sources.list.d/microsoft-prod.list

# Install the .NET Core SDK, set the path, and show the installed version.
RUN apt-get update && \
    apt-get install -y apt-transport-https && \
    apt-get update && \
    apt-get install -y dotnet-sdk-3.1 && \
    export PATH=$PATH:$HOME/dotnet && \
    dotnet --version

# Switch back to the jenkins user.
USER jenkins

In the end, it should look like this.

sitk Jenkins Dockerfile

Now we will build sitk/jenkins:lts with the .NET Core SDK. Create a shell script in the image folder containing this command: docker build -t sitk/jenkins:lts . It should look like this.

sitk Jenkins image build shell script

Go to Windows Terminal and open a Linux terminal. In my case, I have Ubuntu 20.04 running under Windows Subsystem for Linux version 2. Go to the /mnt/c/sitk/jenkins/image directory and run the build script you just created. If you missed the WSL setup and want to catch up, go to our first post of a series of four.

After compiling the image, you should see something like this.

sitk jenkins image compilation

Now we are ready to run docker-compose. Go back to the Windows Terminal and type docker-compose up -d and hit Enter. I recommend saving the command in a start.ps1 file, so you don’t need to remember the syntax next time.

cicd service up and running

This command will start a container in detached (-d) mode with the service cicd, which is based on the sitk/jenkins:lts image,…

docker-compose image

…it will name the container sitk_jenkins,…

docker-compose container name

…it will map container ports 8080 and 50000, which sit on the right side of the colon, to the same ports on the host, which sit on the left side of the colon. As a side note, port 50000 is used in more robust environments when you need to attach Jenkins agents to run pipelines in parallel. Mapping port 8080 to the host is important so we can access the service via http://localhost:8080

docker-compose ports

…and the volumes declaration maps the /var/jenkins_home directory, where all Jenkins data resides inside the container, to c:\sitk\jenkins\jenkins_home on the host. This mapping is essential for managing the data, attaching it to another container for upgrades, and backing up Jenkins data. Most importantly, it survives container stop, restart and deletion, which means no Jenkins data loss.

docker-compose volumes

Go back to Windows Terminal, type docker ps -f "name=sitk_" and hit Enter. If everything went right, you will see the Jenkins container running.

show Jenkins container running

We are almost there. Now go to http://localhost:8080, and you will see something like this.

Unlock Jenkins

Follow the instructions on the screen. Remember when we mapped the container’s /var/jenkins_home directory to the host directory c:\sitk\jenkins\jenkins_home in the YAML file? That is another useful application of Docker bind volumes: you don’t need to go inside the container to access its files. Therefore, open the c:\sitk\jenkins\jenkins_home\secrets\initialAdminPassword file, copy the password, paste it into the administrator password field and click Continue.

Click on the Install suggested plugins button.

Jenkins Install suggested plugins

And wait a couple of minutes for Jenkins to download and install all plugins.

Installing Jenkins plugins

After the plugins are installed, you will be prompted to create the First Admin User.

Jenkins first admin user

Type the information required and hit Save and Continue.

fulfill all required info for the Jenkins first admin user

Leave the Jenkins URL unchanged (http://localhost:8080/) and hit Save and Finish.

Jenkins instance configuration

Now we are done with Jenkins’s initial setup.

Jenkins is ready

Click on Start using Jenkins button, and there you go!

Jenkins login

Before we start creating pipelines for the suggested Jenkins workflow graph, let me share an important hint. Create a PowerShell script that stops and removes the Jenkins container with the docker-compose rm --stop --force command, name it stop.ps1, and save it in the same directory where you previously saved the start.ps1 PowerShell script. Yes, you heard it right: the command will stop and remove the container. Don’t be afraid of doing so, because everything Jenkins needs the next time we start it is in the c:\sitk\jenkins\jenkins_home directory, remember?
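For reference, both helper scripts are one-liners; these are minimal sketches assuming they live next to docker-compose.yaml in c:\sitk\jenkins:

```powershell
# start.ps1 -- create and start the cicd service in detached mode.
docker-compose up -d

# stop.ps1 -- stop and remove the container; the bind-mounted
# jenkins_home directory keeps all Jenkins data safe on the host.
docker-compose rm --stop --force
```

Keeping them under version control together with the YAML file means the whole environment can be recreated with two commands.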

So, let’s do this. Go to Windows Terminal, and under the c:\sitk\jenkins directory, type .\stop.ps1 to stop Jenkins’s container.

stop jenkins container

Let’s start it again with the start.ps1 PowerShell script you previously created.

starting jenkins container

Let’s make sure that Jenkins is up and running because we will need it soon. Visit http://localhost:8080 and enter the admin user credentials you previously created and sign in.

Jenkins login

Since we will check the solution out from GitHub and then restore, clean, build and test it, we will need the .NET SDK Support plugin for Jenkins. Go to the Manage Jenkins option in the left panel and click on Manage Plugins.

Manage Jenkins, Manage Plugins

Click on the Available tab, type “dotnet” in the search field, select .NET SDK Support and click on the Install without restart button.

install NET SDK Support plugin

Please wait until it is completely installed.

NET SDK Support installed

Now we are ready to create the pipeline with some of the stages described in the Jenkins workflow graph. Let’s use eShopOnWeb, an ASP.NET Core application we used in posts three and four of our series of four posts. It is worth looking at them, but you can always visit our eShopOnWeb GitHub repository for more details.

On Jenkins’s home page, select New Item.

Jenkins New Item

Enter eShopOnWeb as the item name, select Pipeline and click OK.

enter item name, select pipeline and click ok

We suggest you give it a brief description and click on the Pipeline tab.

basic cicd pipeline for eShopOnWeb solution

Select the Pipeline script from SCM option from the Definition field.

select pipeline script from SCM

Select the Git option from the SCM field.

select git from SCM field

Copy the eShopOnWeb Git URL into the Repository URL field, make sure the Script Path field is “Jenkinsfile,” and click on the Save button.

Copy eShopOnWeb git URL to the Repository URL field

Now we have to prepare the Jenkinsfile, because it will contain all the stages Jenkins will run in the pipeline. Therefore, go to the eShopOnWeb root folder, create the Jenkinsfile and copy in the content below.

pipeline {
    agent any

    stages {
        stage('Checkout') {
            steps {
                // Set url below to your eShopOnWeb repository.
                git url: '', branch: 'master'
            }
        }
        stage('Restore') {
            steps {
                sh "dotnet restore eShopOnWeb.sln"
            }
        }
        stage('Clean') {
            steps {
                sh "dotnet clean eShopOnWeb.sln"
            }
        }
        stage('Build') {
            steps {
                sh "dotnet build --configuration Release eShopOnWeb.sln"
            }
        }
        stage('Functional, integration and unit tests (xUnit)') {
            steps {
                sh "dotnet test eShopOnWeb.sln"
            }
        }
    }
}
It should look like this.

Jenkinsfile eShopOnWeb

Don’t forget to change the eShopOnWeb Git URL if you forked or cloned the project. You most probably only have to change the sitknewnornal part, highlighted in red, to your GitHub username. And most importantly, don’t forget to commit and push the Jenkinsfile to your repository, because Jenkins will fetch it from there when the pipeline runs.
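As a side note, if you later want Jenkins to archive the test results instead of only printing them to the console, dotnet test can emit a TRX report. A hypothetical variation of the test stage could look like this (the --logger option is standard dotnet CLI; the stage wiring is only a sketch, and publishing the report would still need a results plugin):

```groovy
stage('Functional, integration and unit tests (xUnit)') {
    steps {
        // Write a Visual Studio TRX results file for each test project;
        // a test-results plugin can then pick the files up and chart them.
        sh 'dotnet test eShopOnWeb.sln --logger "trx;LogFileName=results.trx"'
    }
}
```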

To do so, I am using the popular GitLens VSCode extension, which I recommend.

GitLens VSCode extension

Select the GitLens extension and stage the Jenkinsfile by clicking the plus (+) button.

Select GitLens extension and stage Jenkins file by clicking plus button

Type a commit description “first Jenkinsfile commit” and commit the changes by clicking on the checkmark button.

commit Jenkinsfile changes

Push all changes to the eShopOnWeb GitHub repository. Hit Ctrl+Shift+P to open the Command Palette, type “Push”, choose the “Git: Push to…” option and select the eShopOnWeb remote repository from the list. If you don’t have it, click on “Add a new remote…” and follow the instructions.

git push to eShopOnWeb GitHub repository

Now that we have everything set up, let’s run the Jenkins pipeline eShopOnWeb we’ve just created.

Go back to Jenkins in the pipeline eShopOnWeb and click on the Build Now button.

click build now button

Wait a couple of minutes for the pipeline to execute, and you will see something like this.

pipeline eShopOnWeb result

As you can see, all stages in this pipeline have succeeded! The Jenkins project source code is on our GitHub account.

Congratulations, we’ve reached a relevant part of our goal. We now have one of the most powerful and popular CI/CD tools called Jenkins, taking care of our Software Development life cycle’s important steps.

We saved the end-to-end tests with TestCafe, and more, for the next post, because we don’t want to make this one too long. So let’s make it a series of three instead of the two we planned before.

We also have pro bono projects just in case you are interested in learning more about them.

Learn why we started this project by clicking here.

Learn more about other posts here.

Contact us with any suggestions. And follow us on Facebook, Instagram and Twitter.

If you are a good reader like myself, I recommend the following readings:

  • Docker Quick Start Guide: Learn Docker like a boss, and finally own your applications
  • Docker for Developers: Develop and run your application with Docker containers using DevOps tools for continuous delivery
  • C# and .NET Core Test-Driven Development: Dive into TDD to create flexible, maintainable, and production-ready .NET Core applications, Kindle Edition
  • Modern Web Testing with TestCafe: Get to grips with end-to-end web testing with TestCafe and JavaScript 1st Edition, Kindle Edition
  • Continuous Delivery with Docker and Jenkins: Create secure applications by building complete CI/CD pipelines, 2nd Edition, Kindle Edition

See you in the next post!