xUnit, TestCafe, Docker and Jenkins - Sharing is the key

Software Testing Quality Assurance using xUnit, TestCafe along with Docker and Jenkins: an automated approach 3/3

For those who are following me, welcome back and thank you for your support! For those who are visiting for the first time, welcome and enjoy the ride!

Now we are going to close this three-post series with a golden key. Catching up from our last post, where we saw how important CI/CD is to software development focused on quality assurance, we met Docker Compose as a useful alternative to a plain docker run command, discussed Jenkins and its Pipeline concept, saw how to extend Docker images, and shared best practices, hints and outstanding VSCode extensions to increase our productivity and avoid bugs. All of that built on the overview of Software Testing Quality Assurance and Test-Driven Development, plus important tools such as TestCafe, that we covered in the first post.

Jenkins Docker TestCafe

Provided you are now more acquainted with Docker, Jenkins and TestCafe, we can push them a little further. To automate the stages described in the Jenkins Workflow graph, we extended the official Jenkins Docker image by installing .NET Core SDK 3.1 in it. That approach works, but based on the separation of concerns principle there is a more interesting one. Why not let the Jenkins image do what it does best, without any extra software installed on it? In other words, we don’t really need .NET Core SDK 3.1 installed in the Jenkins image to restore, clean, build and unit test the eShopOnWeb solution. I will show you why and how right after this important discussion about the principle itself.

Design Pattern – Separation of concerns

Separation of concerns is a pattern/principle of software architecture design for splitting an application into different parts, so each section addresses a separate concern. The ultimate purpose of the separation of concerns is to construct a well-organized structure. Each component fulfils a significant and intuitive function while optimizing its capacity to adapt to change.

In software architecture, separation of concerns is achieved by creating boundaries. A boundary is any logical or physical constraint that delimits a given set of responsibilities. Examples of boundaries include the use of methods, artifacts, modules and services to describe core behaviour within an application, as well as projects, solutions, folder hierarchies for source organization, application layers and tiers for processing organization.

Separation of concerns – advantages

  1. The lack of repetition and the uniqueness of the individual components’ function makes it easier to manage the overall structure.
  2. As a byproduct of improved maintainability, the system becomes more robust.
  3. The strategies needed to ensure that each element only concerns itself with a single set of coherent obligations often lead to natural points of extensibility.
  4. The decoupling that results from requiring components to concentrate on a single function leads to components that are more easily reused within the same system, in other systems, or in different contexts.

Remember that most, if not all, of the well-known design patterns we use in software engineering apply not only to coding as we know it, but also to defining the infrastructure and architecture used to test and assure the quality of our application.

Going back to my claim that you don’t need .NET Core SDK 3.1 installed in the Jenkins image: the answer is simple. There is a way to run the restore, clean, build and the functional, integration and unit tests of eShopOnWeb in a separate container that the Jenkins container spins up. Furthermore, this container exists only for the duration of the task.
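To make the idea concrete, here is a hedged sketch of what "a separate, short-lived container" means in practice. The SDK image tag matches the one used later in this post; the host path and the /workspace mount point are assumptions for illustration.

```shell
# A disposable sibling container does the work and is removed (--rm)
# as soon as the command finishes; nothing .NET-related ever touches
# the Jenkins image itself.
docker run --rm \
  -v "c:/sitk/eShopOnWeb:/workspace" \
  -w /workspace \
  mcr.microsoft.com/dotnet/core/sdk:3.1-bionic \
  dotnet build eShopOnWeb.sln
```

This is exactly what the Docker Pipeline plugin will automate for us later, one container per stage.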

Enough theory, let’s make it happen!

We will use the Jenkins project we created in the previous post. Open it in VSCode; let’s create a directory called image2 in the Jenkins root directory and copy Dockerfile, build.sh, docker-compose.yaml, start.ps1 and stop.ps1 to this directory. We will make a few changes to them.

In build.sh we will change the name of the image to sitk/jenkins2 to preserve the original image as shown below.

sitk-jenkins2 image on build script

In docker-compose.yaml file, make the changes highlighted in the image below.

changes to docker-compose

The change in the image and container_name is expected since we want to preserve the previous ones.

The mapped port on the host side changes to 8081 so it does not conflict with the one we set up in the previous post.

We added user: root and the /var/run/docker.sock:/var/run/docker.sock volume to allow the Jenkins container to access the Docker server that resides on the host machine, so it can build images and create, start, stop and remove sibling containers. This involves mounting the host machine’s Docker socket into the Jenkins container (note that we use the word sibling instead of child containers because the newly created containers run alongside the Jenkins container rather than inside it).
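Putting the changes together, the image2 docker-compose.yaml might look like the sketch below. The image and volume names follow the conventions described above; the container name and the host path for the volume are my assumptions, since the post leaves them to the screenshot.

```yaml
version: '3.8'
services:
  cicd:
    image: sitk/jenkins2
    container_name: sitk_jenkins2   # assumed; any name distinct from the first post works
    user: root
    ports:
      - 8081:8080                   # 8081 on the host to avoid the previous instance
      - 50000:50000
    volumes:
      - jenkins-home-volume2:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock   # host Docker socket for sibling containers
volumes:
    jenkins-home-volume2:
      driver: local
      driver_opts:
        type: none
        device: c:\sitk\jenkins\image2\jenkins_home   # assumed host path
        o: bind
```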

We changed the volume to jenkins-home-volume2 because we don’t want to interfere with the existing one, which is persisted on the host from the previous post.

In the Dockerfile we removed the .NET Core SDK 3.1 installation and added the Docker Community Edition CLI installation, since we will start sibling containers from within the Jenkins container, as stated above. To do this, copy the code below into the Dockerfile inside image2 and save it.

FROM jenkins/jenkins:lts

# Switch to the root user to install the Docker CLI
USER root

# Show the distro information during the build
RUN uname -a && cat /etc/*release

# Install Docker (Community Edition)
RUN apt-get update -qq \
    && apt-get install -qqy apt-transport-https ca-certificates curl gnupg2 software-properties-common
RUN curl -fsSL https://download.docker.com/linux/debian/gpg | apt-key add -
RUN add-apt-repository \
   "deb [arch=amd64] https://download.docker.com/linux/debian \
   $(lsb_release -cs) \
   stable"
RUN apt-get update -qq \
    && apt-get install -y docker-ce docker-ce-cli containerd.io
RUN usermod -aG docker jenkins

# Switch back to the jenkins user.
USER jenkins

The start.ps1 and stop.ps1 remain the same.

Now let’s build the new image: go to Windows Terminal, open a bash terminal and, under /mnt/c/sitk/jenkins/image2, run ./build.sh. If everything went well, you should see something like this.

compiling sitk-jenkins2 image

Now we start a Jenkins container based on the newly created image by running start.ps1 in Windows Terminal under c:\sitk\jenkins\image2.

start new Jenkins container
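Before configuring anything, you can sanity-check that the Docker CLI we baked into the image really reaches the host daemon through the mounted socket. The container name sitk_jenkins2 is an assumption based on the naming convention above; substitute whatever you put in container_name.

```shell
# If the socket mount works, this prints both the client version
# (from inside the Jenkins container) and the server version
# (the Docker daemon on the host).
docker exec sitk_jenkins2 docker version

# Sibling containers started by pipelines will show up here too,
# right next to the Jenkins container itself.
docker exec sitk_jenkins2 docker ps
```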

Let’s access Jenkins at http://localhost:8081/ and set it up as we did in the second post. Notice that this time we will not need to install the .NET SDK Support plugin for Jenkins. Instead, we will need the Docker Pipeline plugin, so Jenkins can dynamically build images and create, start, stop and remove containers in the pipeline.

As soon as Jenkins is ready, go to Manage Jenkins -> Manage Plugins, click on the Available tab, type “docker” in the search field, select Docker Pipeline and click on the Install Without Restart button.

Docker Pipeline plugin

Jenkins will also install the Docker Commons plugin, since it is a dependency.

Making the most of the Jenkins plugin ecosystem, let me introduce Allure Test Report.

Allure Test Report

Allure Test Report is a flexible, lightweight, multi-language test reporting tool. It presents a concise view of what has been tested in a tidy web report and helps everyone involved in the development process extract the maximum of useful information from the everyday execution of tests.

How it works

Allure is based on the standard xUnit results output but adds some supplementary data. Every report is generated in two steps. In the first step, during test execution, a small library called an adapter, attached to the testing framework, saves information about the executed tests to XML files. Adapters are already available for popular Java, PHP, Ruby, Python, Scala and C# test frameworks. In the second step, the Allure command-line tool (which the Jenkins plugin wraps for us) turns those XML files into the HTML report.
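To give an idea of the first step, here is a minimal, hand-written file in the xUnit results format that such an adapter produces. The attribute set is simplified and the test name is a made-up example, but the shape (assemblies, collections, tests with a result) is what Allure consumes:

```shell
# Create a minimal result file in the xUnit output format that an
# adapter such as XunitXml.TestLogger emits (attributes simplified;
# the test name is a hypothetical example).
mkdir -p allure-results/UnitTests
cat > allure-results/UnitTests/TestResults.xml <<'EOF'
<assemblies>
  <assembly name="UnitTests.dll" total="1" passed="1" failed="0" skipped="0">
    <collection total="1" passed="1" failed="0" skipped="0">
      <test name="BasketTests.AddItem_IncreasesItemCount" result="Pass" time="0.012" />
    </collection>
  </assembly>
</assemblies>
EOF
# Count the passing tests recorded in the file:
grep -c 'result="Pass"' allure-results/UnitTests/TestResults.xml   # prints 1
```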

To install the Allure Report plugin, go to Manage Jenkins -> Manage plugins, click on the Available tab, and in the search field, type “allure” and select Allure and click on the Install Without Restart button.

Next, go to Manage Jenkins -> Global Tool Configuration and scroll until you find the Allure Commandline section. In the Name field inside the Allure Commandline panel, type “Allure 2.13.7”. The version should match the one you select in the From Maven Central field, as shown below.

setup allure in global tool configuration

We need to install the so-called adapter for xUnit, the C# unit testing framework we use on eShopOnWeb, as mentioned in the first post. To do so, install the XunitXml.TestLogger package, which generates the XML files Allure needs to build the reports.

Go to the Windows Terminal and under c:\sitk\eShopOnWeb run the following commands.

dotnet add ./tests/FunctionalTests/FunctionalTests.csproj package XunitXml.TestLogger --version 2.0.0
dotnet add ./tests/IntegrationTests/IntegrationTests.csproj package XunitXml.TestLogger --version 2.0.0
dotnet add ./tests/UnitTests/UnitTests.csproj package XunitXml.TestLogger --version 2.0.0

adding XunitXml-TestLogger
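After these commands run, each of the three test project files should contain a package reference along these lines (the version comes from the commands above; the surrounding ItemGroup may be merged with existing references):

```xml
<ItemGroup>
  <PackageReference Include="XunitXml.TestLogger" Version="2.0.0" />
</ItemGroup>
```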

To run the Jenkins pipeline using the Docker and Allure plugins, we need a new Jenkinsfile. Therefore, create a new directory called Jenkins under the eShopOnWeb root directory and place a Jenkinsfile with the content below.


pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent { 
                docker { image 'mcr.microsoft.com/dotnet/core/sdk:3.1-bionic'} 
            }
            steps {
                checkout([$class: 'GitSCM', branches: [
                    [name: '*/master']
                ],
                userRemoteConfigs: [
                    [url: 'https://github.com/sitknewnormal/eShopOnWeb.git']
                ]
                ])
            }
        }
        stage('Restore') {
            agent { 
                docker { image 'mcr.microsoft.com/dotnet/core/sdk:3.1-bionic'} 
            }
            steps {
                sh "dotnet restore --packages ./.nuget/packages eShopOnWeb.sln"
            }
        }
        stage('Clean') {
            agent { 
                docker { image 'mcr.microsoft.com/dotnet/core/sdk:3.1-bionic'} 
            }
            steps {
                sh "dotnet clean eShopOnWeb.sln"
            }
        }
        stage('Build') {
            agent { 
                docker { image 'mcr.microsoft.com/dotnet/core/sdk:3.1-bionic'} 
            }
            steps {
                sh "dotnet build eShopOnWeb.sln --no-restore --configuration Release"
            }
        }
        stage('Functional, integration and unit tests (xUnit)') {
            agent { 
                docker { image 'mcr.microsoft.com/dotnet/core/sdk:3.1-bionic'} 
            }
            steps {
                sh "dotnet test ./tests/FunctionalTests/FunctionalTests.csproj --configuration Release --logger xunit --no-build --no-restore --results-directory ./allure-results/FunctionalTests"
                sh "dotnet test ./tests/IntegrationTests/IntegrationTests.csproj --configuration Release --logger xunit --no-build --no-restore --results-directory ./allure-results/IntegrationTests"
                sh "dotnet test ./tests/UnitTests/UnitTests.csproj --configuration Release --logger xunit --no-build --no-restore --results-directory ./allure-results/UnitTests"
            }
        }
        stage('End 2 end eShopOnWeb tests with TestCafe') {
            agent { 
                docker { 
                    image 'testcafe/testcafe'
                    args '--entrypoint=\'\''
                } 
            }
            steps {
                sh "testcafe chromium:headless tests/e2eTests/*_test.js -r spec,xunit:allure-results/e2eTests/TestResults.xml" 
            }
        }
        stage('Publish Reports') {
            agent { 
                docker { image 'openjdk'} 
            }
            steps{
                script {
                    allure ([
                        includeProperties: false, 
                        jdk: '', 
                        results: [
                            [path: 'allure-results/FunctionalTests'],
                            [path: 'allure-results/IntegrationTests'],
                            [path: 'allure-results/UnitTests'],
                            [path: 'allure-results/e2eTests']
                        ]
                    ])
                }            
            }
        }
    }
}

It should look like this.

Jenkinsfile - pipeline with using docker and Allure

Don’t forget to update the eShopOnWeb git URL if you forked or cloned the project. In the URL “https://github.com/sitknewnormal/eShopOnWeb.git” you most probably only have to replace sitknewnormal with your GitHub username. And most importantly, don’t forget to commit and push the Jenkinsfile to your repository, because Jenkins will pull it from there when the pipeline runs.

There are some important changes I want to point out if we compare the previous Jenkinsfile with the new one.

comparing Jenkinsfiles

In the first Jenkinsfile, on the left-hand side, all the pipeline stages run on any agent, as in the previous post. That means they run on the Jenkins master or on one of its slave containers, if there were any. Since we added no slaves to the Jenkins instance we set up previously, everything runs in the Jenkins master container.

On the right-hand side, on the other hand, we defined an agent based on the official mcr.microsoft.com/dotnet/core/sdk:3.1-bionic image for each stage of the pipeline. That means Jenkins will pull the image from Docker Hub on the first run, create a sibling container that runs alongside Jenkins, perform the steps and then remove it. It repeats this process for each stage.
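Under the hood, the Docker Pipeline plugin does roughly the equivalent of the following for each stage. This is a simplified sketch: the real invocation also propagates the Jenkins user, environment variables and extra volume mounts.

```shell
# Approximate lifecycle of one stage's sibling container
docker pull mcr.microsoft.com/dotnet/core/sdk:3.1-bionic    # first run only
docker run -d -t \
  -v "$WORKSPACE:$WORKSPACE" -w "$WORKSPACE" \
  mcr.microsoft.com/dotnet/core/sdk:3.1-bionic cat          # 'cat' keeps it alive
# ...Jenkins then runs the stage's sh steps inside it with docker exec...
docker rm -f <container-id>                                 # removed when the stage ends
```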

If we focus on the Functional, integration and unit tests (xUnit) stage, we notice that it runs in a container using the same image as the previous stages. The logger highlighted in the image below writes the test results in the xunit format to the respective project directories under the ./allure-results directory, so the Jenkins Allure plugin can read them and generate the report.

unit tests stage in jenkins pipeline depicted

For the End 2 end eShopOnWeb tests with TestCafe stage, we run all 5 end-to-end tests in a container based on the official testcafe/testcafe image. Notice that, to use the testcafe chromium tests/e2eTests -r spec,xunit:allure-results/e2eTests/TestResults.xml command in the step, we had to override the image entry point with the --entrypoint=\'\' argument since, as we’ve seen in our previous post, the image’s syntax is a little bit different. The parameter -r spec,xunit:allure-results/e2eTests/TestResults.xml tells TestCafe to report the test results to stdout in the spec format and, for Allure’s use, to write an xunit-format file to the allure-results/e2eTests directory.

And finally, the last stage publishes the reports from all tests using the Jenkins Allure plugin. To do so, it runs a container based on the official OpenJDK image. Notice that the plugin uses the results saved under the allure-results paths to generate the reports.

Now that we understand what the Jenkins pipeline will do, let’s run it.

On Jenkins’s home page, select New Item.

Jenkins New Item

Enter eShopOnWeb as the item name, select Pipeline and click OK.

enter item name, select pipeline and click ok

We suggest you give it a brief description and click on the Pipeline tab.

basic cicd pipeline for eShopOnWeb solution

Select the Pipeline script from SCM option from the Definition field.

select pipeline script from SCM

Select the Git option from the SCM field.

select git from SCM field

Copy eShopOnWeb git URL https://github.com/sitknewnormal/eShopOnWeb.git to the Repository URL field, and make sure the Script Path field is “Jenkins/Jenkinsfile,” and click on the Save button.

Jenkinsfile script path

Before we run this Jenkins Pipeline, we need to run the eShopOnWeb application since there will be end-to-end tests with TestCafe.

For doing so,  make sure you have the MSSQL Server 2019 running. We explained how to do it here.

First, let’s run eShopOnWeb without debugging by selecting the menu Run and then Run Without Debugging, as shown below, and wait until it automatically opens the browser with the URL https://localhost:5001.

run eShopOnWeb without debugging

As soon as the application starts, go back to Jenkins and click on the Build Now button. It will take around 5min to finish the whole Pipeline, which includes the Allure Report publishing.

In the end, you will see something like this.

Jenkins pipeline eShopOnWeb result

If everything went well up to the Publish Reports stage, click on the Allure Report icon, the colourful one that appears on the right side of the last execution in the Build History section, which in my case is #3.

You will see the Allure Report showing that 79 out of 79 test cases passed. It is not usual to have 100% of test cases passing, but either way you will have a number you can work from.

Allure overview

There is a lot of useful information in the Allure Report that you can get the most out of. It is not the purpose of this post to dig deep into Allure, so take time to explore the Allure Report and be positively surprised by how much it can help you with Software Quality Assurance.

Congratulations, we have made it! We went through some of the most important steps in the Software Development lifecycle using some powerful, cutting-edge tools and concepts to help us develop software better!! The Jenkins project and eShopOnWeb source code are on our GitHub account.

If you need any help or something goes wrong, let me know; it will be my pleasure to help you. If you want us to expand on some point we have discussed, tell me so we can cover it in the next posts.

We also have pro bono projects just in case you are interested in learning more about them.

Learn why we started this project by clicking here.

Learn more about other posts here.

Contact us for any suggestions. And follow us on Facebook, Instagram and Twitter.

If you are a good reader like myself, I recommend the following readings:

  1. Docker Quick Start Guide: Learn Docker like a boss, and finally own your applications
  2. Docker for Developers: Develop and run your application with Docker containers using DevOps tools for continuous delivery
  3. C# and .NET Core Test-Driven Development: Dive into TDD to create flexible, maintainable, and production-ready .NET Core applications, Kindle Edition
  4. Modern Web Testing with TestCafe: Get to grips with end-to-end web testing with TestCafe and JavaScript 1st Edition, Kindle Edition
  5. Continuous Delivery with Docker and Jenkins: Create secure applications by building complete CI/CD pipelines, 2nd Edition, Kindle Edition

See you in the next post!

xUnit, TestCafe, Docker and Jenkins - Sharing is the key

Software Testing Quality Assurance using xUnit, TestCafe along with Docker and Jenkins: an automated approach 2/3

This is the second of three posts about QA Testing using xUnit, TestCafe and Docker; now let’s add Jenkins to give it an automated approach. For those who missed the previous post, it is worth reading, since most of the steps we will take here have prerequisites we covered there.

What is Jenkins, what is it used for, and why use it?

Jenkins logo

Jenkins is an open-source automation platform, written in Java, with lots of plugins designed for Continuous Integration and Continuous Delivery (CI/CD) purposes. Jenkins is used to continuously build and test software projects, making it easier for developers to integrate changes into a project and for users to obtain a fresh build. It also enables you to continuously deliver your applications by integrating with a wide range of testing and deployment technologies.

Continuous Integration / Continuous Delivery

Continuous Integration / Continuous Delivery is a software development practice in which developers commit changes to the source code in a shared repository several times a day or even more frequently. Each commit made to the repository is then built. This helps teams detect problems early. Besides this, depending on the CI/CD tool, there are many other functions, such as deploying the built application to the test server and delivering the build and test results to the teams involved.

Besides all of that technical definition, Jenkins is this distinct gentleman in the image that came to serve you and make your life as a developer better. Let’s not take it for granted and make the most of Jenkins.

jenkins plugins

We all know that CI/CD is one of the most important parts of DevOps, used to integrate the various DevOps stages. For those who have no clue what DevOps means, it is a set of practices that integrates software development and IT operations. It aims to shorten the systems development life cycle and provide high software quality with continuous integration and delivery.

With Jenkins, through automation, not only individual developers but also companies can accelerate software development. Jenkins incorporates all sorts of life-cycle development processes, including documenting, building, testing, packaging, staging, deployment, static analysis, and even more.

Jenkins’ benefits include:

  • With great community support, it is an open-source application.
  • It is straightforward to install using Jenkins docker’s official image.
  • To simplify your job, it has 1000+ plugins. If a plugin doesn’t exist, you can code it and share it with the community.
  • It is cost-free.
  • It is Java-built and is, therefore, portable to all major platforms.

CI/CD with Jenkins depicted

Imagine a situation where the application’s full source code has been built and then deployed for testing on the test server. It sounds like the ideal way to build software, but there are several pitfalls in this process. I’m going to try to clarify them:

  • For the test results, developers have to wait until the full software is developed.
  • There is a strong probability that several bugs will be found in the test results. It is difficult for developers to find these bugs since they have to review the entire source code.
  • It delays the delivery process of applications.
  • There is a lack of continuous feedback on coding or architectural problems, build failures, test status and file release uploads, due to which software quality will decline.
  • The method as a whole is manual, raising the likelihood of frequent failure.

From the issues mentioned above, it is evident that the software delivery process is sluggish and software quality will likely deteriorate over time, leading to customer disappointment. So there is a desperate need for a system where developers can constantly trigger a build and test for every change made to the source code. This is what CI/CD is all about. Jenkins is one of the most mature CI/CD platforms available, so let us see how Jenkins overcomes these deficiencies in the simplified, self-explanatory workflow diagram below.

simplified CI diagram with Jenkins

As you can see, Jenkins uses the pipeline concept to chain multiple steps, according to parameters you set, to perform both simple and complex tasks. Once built, pipelines can build code and coordinate the work needed to push applications from commit to delivery.

Let’s make things happen now!

A useful hint I am going to give you is to use docker-compose as much as you can. Docker Compose is a tool for describing and running multi-container Docker applications. With docker-compose, you use a YAML file to configure your application’s services. Then, you create and start all the services from your configuration with a single command. Besides that, the YAML file is easy to read, and you can treat it as source code and track all changes with your preferred SCM tool. Docker Compose is installed as part of Docker Desktop. If you haven’t installed Docker Desktop yet, go to our first post of a series of four and follow the instructions.

docker-compose

Since I am using Windows, go to Powershell or Windows Terminal and create jenkins directory by running New-Item -Type Directory -Path jenkins command, as shown below.

Jenkins folder

From inside Jenkins directory type code . and hit Enter to open an instance of VSCode.

open vscode from iside jenkins folder

Create a file called docker-compose.yaml inside Jenkins folder.

docker-compose yaml file

Copy this code below into docker-compose.yaml file and save it.

version: '3.8'
services:
  cicd: 
    image: sitk/jenkins:lts
    container_name: sitk_jenkins
    ports:
      - 8080:8080
      - 50000:50000
    volumes:
      - jenkins-home-volume:/var/jenkins_home
volumes:
    jenkins-home-volume:
      driver: local
      driver_opts:
        type: none
        device: c:\sitk\jenkins\jenkins_home
        o: bind

It should look like this.

docker-compose yaml file filled up

YAML is a human-readable data-serialization format that can be used with any programming language and is commonly used to write configuration files. It is imperative to keep this file valid, which means following some rules and indentation. Therefore, I recommend this popular VSCode extension, with more than 4.6MM installs, to help us validate the file.
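Besides the extension, docker-compose itself can validate the file; a quick check from the directory containing docker-compose.yaml:

```shell
# Parses docker-compose.yaml and prints the resolved configuration;
# exits non-zero with an error message if the YAML is invalid.
docker-compose config
```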

YAML Red Hat VSCode extension

We will use the official Jenkins Docker image jenkins/jenkins:lts and extend it by installing the .NET Core SDK, since we will restore, clean, build and unit test the eShopOnWeb solution.

Docker Jenkins

Building a new Jenkins Docker image

Let’s build the sitk/jenkins:lts image (the tag our docker-compose.yaml expects). Create the image folder inside the jenkins folder, and create a Dockerfile inside the image folder with the content shown below.

FROM jenkins/jenkins:lts

# Switching to root user to install .NET Core SDK
USER root

# Show the distro information during compilation!
RUN uname -a && cat /etc/*release

# Based on instructions at https://docs.microsoft.com/en-us/dotnet/core/install/linux-debian#debian-9-
# Installing with APT can be done with a few commands. Before you install .NET SDK, run the following commands 
# to add the Microsoft package signing key to your list of trusted keys and add the package repository.
RUN wget -O - https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg && \
    mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/ && \
    wget https://packages.microsoft.com/config/debian/9/prod.list && \
    mv prod.list /etc/apt/sources.list.d/microsoft-prod.list && \
    chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg && \
    chown root:root /etc/apt/sources.list.d/microsoft-prod.list


# Install the .Net Core SDK, set the path, and show the version of core installed.
RUN apt-get update && \
    apt-get install apt-transport-https && \
    apt-get update && \
    apt-get install -y dotnet-sdk-3.1 && \
    export PATH=$PATH:$HOME/dotnet && \
    dotnet --version

# Switching back to the jenkins user.
USER jenkins

In the end, it should look like this.

sitk Jenkins Dockerfile

Now we will build the sitk/jenkins:lts image with the .NET Core SDK. Create a build.sh file in the image folder with this command: docker build -t sitk/jenkins:lts . (tagging it lts so it matches the image name we referenced in docker-compose.yaml). It should look like this.
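For reference, the whole build.sh can be as small as this. Note that tagging the image explicitly as lts is my suggestion to keep it in sync with the sitk/jenkins:lts name in docker-compose.yaml; a bare docker build -t sitk/jenkins . would tag it latest instead, and the compose file would then fail to find the image.

```shell
#!/bin/bash
# Build the extended Jenkins image from the Dockerfile in this directory.
# The :lts tag matches the image name used in docker-compose.yaml.
docker build -t sitk/jenkins:lts .
```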

sitk Jenkins image build shell script

Go to Windows Terminal and open a Linux terminal. In my case, I have Ubuntu 20.04 running as Windows Subsystem for Linux version 2. Go to the /mnt/c/sitk/jenkins/image directory and run ./build.sh. If you missed it and want to catch up, go to our first post of a series of four.

After compiling the image, you should see something like this.

sitk jenkins image compilation

Now we are ready to run docker-compose. Go back to the Windows Terminal and type docker-compose up -d and hit Enter. I recommend saving the command in a start.ps1 file, so you don’t need to remember the syntax next time.

cicd service up and running

This command will start a container in detached -d mode with the service cicd which is based on the sitk/jenkins:lts image,…

docker-compose image

…it will name the container sitk_jenkins,…

docker-compose container name

…it will map the container’s internal ports 8080 and 50000, which sit on the right side of the colon, to the same ports on the host, which sit on the left side of the colon. Out of curiosity, port 50000 is used in robust environments when you need to attach Jenkins slave agents to run pipelines in parallel. Mapping port 8080 to the host is important so we can access the service via http://localhost:8080

docker-compose ports

…and the volumes declaration maps the /var/jenkins_home directory, where all Jenkins data resides inside the container, to c:\sitk\jenkins\jenkins_home on the host. This mapping is essential to manage the data, attach it to another container for upgrade purposes, and back up Jenkins data. Most importantly, it survives container stop/restart/deletion, which means no Jenkins data loss.

docker-compose volumes

Go back to Windows Terminal, type docker ps -f "name=sitk_" and hit Enter. If everything went right, you would see Jenkins container running.

show Jenkins container running

We are almost there, now go to http://localhost:8080, you will see something like this.

Unlock Jenkins

Follow the instructions on the screen. Remember when we mapped the container’s /var/jenkins_home directory to the host directory c:\sitk\jenkins\jenkins_home in the YAML file? That’s another useful application of Docker bind volumes: you don’t need to go inside the container to access files. Therefore, open the c:\sitk\jenkins\jenkins_home\secrets\initialAdminPassword file, copy the password, paste it into the administrator password field and click Continue.

Click on the Install suggested plugins button.

Jenkins Install suggested plugins

And wait a couple of minutes for Jenkins to download and install all plugins.

Installing jenkins plugins.jpg

After the plugins are installed, you will be prompted to create the First Admin User.

Jenkins first admin user

Type the information required and hit Save and Continue.

fulfill all required info for the Jenkins first admin user

Leave the Jenkins URL unchanged (http://localhost:8080/) and hit Save and Finish.

Jenkins instance configuration

Now we are done with Jenkins’s initial setup.

Jenkins is ready

Click on Start using Jenkins button, and there you go!

Jekins login

Before we start creating pipelines for the suggested Jenkins Workflow graph, let me share an important hint. Create a PowerShell script that stops and removes the Jenkins container with the docker-compose rm --stop --force command, name it stop.ps1, and save it in the same directory where you previously saved the start.ps1 script. Yes, you heard it right: the command will stop and remove the container. Don’t be afraid of doing so, because everything Jenkins needs to work the next time we start it is in the c:\sitk\jenkins\jenkins_home directory, remember?

So, let’s do this. Go to Windows Terminal under the c:\sitk\jenkins directory and type .\stop.ps1 to stop the Jenkins container.

stop jenkins container

Let’s start it again with the start PowerShell script you created previously.

starting jenkins container

Let’s make sure that Jenkins is up and running because we will need it soon. Visit http://localhost:8080 and enter the admin user credentials you previously created and sign in.

Jenkins login

Since we will check the solution out from GitHub and restore, clean, build and test it, we will need the .NET SDK Support plugin for Jenkins. Go to the Manage Jenkins option in the left panel and click on Manage Plugins.

Manage Jenkins manage plugins

Click on the Available tab, type “dotnet” in the search field, select .NET SDK Support and click on the Install without restart button.

install NET SDK Support plugin

Please wait until it is completely installed.

NET SDK Support installed

Now we are ready to create the pipeline with some of the stages described in the Jenkins Workflow graph. Let’s use eShopOnWeb, an ASP.NET Core application we used in posts three and four of our earlier series of four posts. They are worth a look, but you can always visit our eShopOnWeb GitHub repository for more details.

On Jenkins’s home page, select New Item.

Jenkins New Item

Enter eShopOnWeb as the item name, select Pipeline and click OK.

enter item name, select Pipeline and click OK

We suggest you give it a brief description and click on the Pipeline tab.

basic cicd pipeline for eShopOnWeb solution

Select the Pipeline script from SCM option from the Definition field.

select pipeline script from SCM

Select the Git option from the SCM field.

select git from SCM field

Copy eShopOnWeb git URL https://github.com/sitknewnormal/eShopOnWeb.git to the Repository URL field, and make sure the Script Path field is “Jenkinsfile,” and click on the Save button.

Copy eShopOnWeb git URL to the Repository URL field

Now we have to prepare the Jenkinsfile, because it will contain all the stages Jenkins will run in the pipeline. Go to the eShopOnWeb root folder, create a file named Jenkinsfile and copy the content below into it.

pipeline {
    agent any

    stages {
        stage('Checkout') {
            steps {
                git url: 'https://github.com/sitknewnormal/eShopOnWeb.git', branch: 'master'
            }
        }
        stage('Restore') {
            steps {
                sh "dotnet restore eShopOnWeb.sln"
            }
        }
        stage('Clean') {
            steps {
                sh "dotnet clean eShopOnWeb.sln"
            }
        }
        stage('Build') {
            steps {
                sh "dotnet build --configuration Release eShopOnWeb.sln"
            }
        }
        stage('Functional, integration and unit tests (xUnit)') {
            steps {
                sh "dotnet test eShopOnWeb.sln"
            }
        }
    }
}

It should look like this.

Jenkinsfile eShopOnWeb

Don’t forget to change the eShopOnWeb git URL if you forked or git cloned the project. You most probably only have to change sitknewnormal (highlighted in red) in “https://github.com/sitknewnormal/eShopOnWeb.git” to your GitHub username. And most importantly, don’t forget to commit and push the Jenkinsfile to your repository, because Jenkins will pull it from there when the pipeline runs.
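Remember the separation-of-concerns discussion at the top of this post: instead of installing the .NET Core SDK inside the Jenkins image, the pipeline could run its stages inside a .NET SDK container. A hedged sketch of that alternative, assuming Jenkins has the Docker Pipeline plugin installed and access to the Docker daemon:

```groovy
pipeline {
    // Run every stage inside the official .NET Core 3.1 SDK image,
    // so the Jenkins image itself stays free of .NET tooling.
    agent {
        docker { image 'mcr.microsoft.com/dotnet/core/sdk:3.1' }
    }
    stages {
        stage('Build') {
            steps {
                sh "dotnet build --configuration Release eShopOnWeb.sln"
            }
        }
    }
}
```

The stages stay the same as in the Jenkinsfile above; only the agent changes.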

To do so, I am using the popular GitLens VSCode extension, which I recommend.

GitLens VSCode extension

Select GitLens extension and stage Jenkinsfile file by clicking the plus (+) button.

Select GitLens extension and stage Jenkins file by clicking plus button

Type a commit description “first Jenkinsfile commit” and commit the changes by clicking on the checkmark button.

commit Jenkinsfile changes

Push all changes to the eShopOnWeb GitHub repository. Hit Ctrl+Shift+P to open the Command Palette, type “Push”, choose the “Git: Push to…” option and select the eShopOnWeb remote repository from the list. If you don’t have one, click “Add a new remote…” and follow the instructions.

git push to eShopOnWeb GitHub repository
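If you prefer the command line over GitLens, the equivalent git commands are:

```shell
git add Jenkinsfile
git commit -m "first Jenkinsfile commit"
git push origin master
```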

Now that we have everything set up, let’s run the Jenkins pipeline eShopOnWeb we’ve just created.

Go back to Jenkins in the pipeline eShopOnWeb and click on the Build Now button.

click build now button

Wait a couple of minutes for the pipeline to execute, and you will see something like this.

pipeline eShopOnWeb result

As you can see, all stages in this pipeline have succeeded! The Jenkins project source code is on our GitHub account.

Congratulations, we’ve reached a relevant part of our goal. We now have one of the most powerful and popular CI/CD tools called Jenkins, taking care of our Software Development life cycle’s important steps.

We saved the end-to-end tests with TestCafe, and more, for the next post because we don’t want to make this one too long. So let’s make it a series of three instead of two, as we planned before.

We also have pro bono projects just in case you are interested in learning more about them.

Learn why we started this project by clicking here.

Learn more about other posts here.

Contact us with any suggestions. And follow us on Facebook, Instagram and Twitter.

If you are a good reader like myself, I recommend the following readings:

  1. Docker Quick Start Guide: Learn Docker like a boss, and finally own your applications
  2. Docker for Developers: Develop and run your application with Docker containers using DevOps tools for continuous delivery
  3. C# and .NET Core Test-Driven Development: Dive into TDD to create flexible, maintainable, and production-ready .NET Core applications, Kindle Edition
  4. Modern Web Testing with TestCafe: Get to grips with end-to-end web testing with TestCafe and JavaScript 1st Edition, Kindle Edition
  5. Continuous Delivery with Docker and Jenkins: Create secure applications by building complete CI/CD pipelines, 2nd Edition, Kindle Edition

See you in the next post!

xUnit, TestCafe, Docker and Jenkins - Sharing is the key

Software Testing Quality Assurance using xUnit, TestCafe along with Docker and Jenkins: an automated approach 1/3

One of the most important steps in the Software Development life cycle is Software Testing Quality Assurance. There are many ways to deliver quality software in all phases of the development process. It’s undeniable that the earlier you introduce QA methods and techniques into your development process, the more cost-effective it will be. Furthermore, you will most likely finish on time.

What is Software Testing Quality Assurance?

Software Testing Quality Assurance is a process to ensure the quality of software products or services offered by an organization to its customers. Quality assurance focuses on improving software production and making it effective and efficient in compliance with software products’ quality standards. Quality Assurance is commonly referred to as QA Testing.

It is usually based on the Plan-Do-Check-Act cycle, which the organization evaluates and improves periodically, most likely supported by tools.

To ensure that the product is designed and delivered with proper protocols, a company must use Quality Assurance. This reduces problems and errors in the final product.

Quality Assurance best practices:

  • Create a robust testing environment
  • Select the release criteria carefully
  • Apply automated testing to high-risk areas to save money; it also speeds up the whole process
  • Allocate appropriate time for each phase
  • Prioritize bug fixes based on how the application is used
  • Form a dedicated team for security and quality testing
  • Simulate tests in a production environment

There is much more to say about Software Testing Quality Assurance, but it is not the focus of this post.

Software Testing – theory into practice

Let’s use eShopOnWeb, a sample ASP.NET Core 3.1 reference application running in a Docker container and persisting data in a SQL Server database, for our Software Testing experiment. You can find more detailed information about preparing this environment here. Reading it is strongly recommended, since there are some prerequisite steps we are not going to cover here.

Fine-grained testing with xUnit

xUnit is the collective name for several unit testing frameworks that derive their structure and functionality from Smalltalk’s SUnit. SUnit, designed by Kent Beck in 1998, was written in a highly structured object-oriented style, which lent itself easily to contemporary languages such as Java and C#.

xUnit testing

Building functional, integration and unit tests is among the must-have best practices when developing software. In the image below, you can see that the eShopOnWeb BasketAddItem unit test uses xUnit, a free, open-source, community-focused unit testing tool for the .NET Framework.

eShopOnWeb BasketAddItem unit test

To run all the eShopOnWeb solution tests, go to the VSCode Terminal, which opens a bash terminal since the solution runs in a Linux container. Make sure you are in the solution root directory, /workspaces/eShopOnWeb, then type dotnet test eShopOnWeb.sln and hit Enter.

running all eShopOnWeb solution tests
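If you only want a subset of the tests while iterating, dotnet test accepts a --filter expression; for example (the “Basket” name here is just illustrative):

```shell
# Run only the tests whose fully qualified name contains "Basket"
dotnet test eShopOnWeb.sln --filter "FullyQualifiedName~Basket"
```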

Developing inside a container gives you lots of advantages and huge flexibility if you happen to use different languages and/or dependencies across your projects, not to mention that your workstation or laptop will not be cluttered with lots of installed software. To learn more about this, visit the first post of our series of four.

Using the TDD (Test-Driven Development) approach in the development process is one of the most effective ways to develop functional, integration and unit tests. This is just a useful recommendation!

What is Test-Driven Development?

In software development, TDD is a design process based on repeating a concise development cycle in which requirements are converted into specific test cases.

These are TDD steps:

  1. Write a unit test that fails.
  2. Write enough code to make the test pass — at this step, we don’t care about good code.
  3. Refactor your code from the previous step.

We will not cover this technique in depth here either, since our focus is an in-depth approach to Software Testing using xUnit and TestCafe along with Docker and Jenkins.

TestCafe: testing software from a user perspective

Even though building functional, integration and unit tests is important for testing the back-end and business tiers, it is not enough when we are talking about web application quality assurance. We will most likely need to test the application end to end, covering the frontend and its behaviour in different browsers, devices and screen resolutions, simulating a test from a user perspective.

TestCafe logo

To run tests from a user perspective, there are many options, but let’s choose TestCafe, an open-source project and a pure NodeJS end-to-end solution for web app testing. It takes care of all phases: starting browsers, running tests, collecting data from tests and producing reports. TestCafe doesn’t need browser plugins: it works out of the box on all popular modern browsers.

The installation process for TestCafe is easy, but you might have problems getting it up and running locally. TestCafe depends on NodeJS, and you will need at least one compatible browser set up on your device. In certain situations, such as having restricted access to your work machine’s user account, you might not be able to perform these installs easily.
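For reference, a local (non-Docker) setup boils down to two commands, assuming NodeJS and a browser such as Chrome are already installed:

```shell
# Install TestCafe globally and run the e2e tests in a locally installed Chrome
npm install -g testcafe
testcafe chrome tests/e2eTests/*_test.js
```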

There is also the scenario of running your end-to-end tests in various environments, such as continuous integration tools like Jenkins. Chances are these systems will not have all the dependencies you need to run your tests. Plus, you’ll need to keep every dependency on these systems up to date.

docker

One of the key advantages of using Docker is that you can use the same image on several operating systems and not worry about anything working on each tester’s device differently. It ensures a consistent image in all environments with everything you need, provided you use the same image version.

Ok, enough theory. Let’s materialize it. Assuming you already have Docker installed on your machine and have git cloned eShopOnWeb (steps we explained in detail in the first, second and third posts of a series of four), we are now prepared to test some scenarios using the official, stable TestCafe Docker image.

First, let me explain the scenarios I have prepared. We are going to test 5 scenarios; TestCafe calls each scenario a fixture:

  1. eShopOnWeb Home page – verifies that the eShopOnWeb home page loads properly. Don’t forget to change the IP address in the URL (https://192.168.2.217:5001/) shown below to your machine’s (i.e. the Docker containers’ host)
    eShopOnWeb home page load test scenario
  2. eShopOnWeb Login – checks if the user can log in to their account 
    eShopOnWeb login test scenario
  3. eShopOnWeb Use Case 01 checks if the brand filter visual component  exists
    eShopOnWeb use case 01 test scenario
  4. eShopOnWeb Use Case 02 checks if the brand filter is clickable and there is a “.NET” option
    eShopOnWeb use case 02 test scenario
  5. eShopOnWeb Use Case 03 checks if the filter returned the right number of items
    eShopOnWeb use case 03 test scenario
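To give you a feel for the syntax, here is a minimal sketch of the first fixture. The file name and the selector are illustrative assumptions; the actual tests live in the repository’s tests/e2eTests folder:

```javascript
// home_test.js - sketch of the "eShopOnWeb Home page" fixture
import { Selector } from 'testcafe';

fixture `eShopOnWeb Home page`
    .page `https://192.168.2.217:5001/`; // change to your Docker host's IP

test('Home page loads properly', async t => {
    // Assert that the catalog page rendered its brand filter dropdown
    // (the selector below is a hypothetical example)
    await t.expect(Selector('.esh-catalog-filter').exists).ok();
});
```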

TestCafe is very straightforward and powerful, so take the chance and get to know it!

Before you go to the next step, make sure you have the MSSQL Server 2019 running. We explained how to do it here.

First, let’s run eShopOnWeb without debugging by selecting the Run menu and then Run Without Debugging, as shown below, and wait until the browser automatically opens with the URL https://localhost:5001.
run eShopOnWeb without debugging

To run the 5 tests, open PowerShell or Windows Terminal and run this command: docker run --rm -v c:/sitk/eShopOnWeb/tests/e2eTests:/tests -it testcafe/testcafe chromium /tests/*_test.js. If you haven’t pulled the TestCafe image before, you will see something like the image below.
running 5 scenarios using the TestCafe official image

As you can see, all tests have passed, which is great!!

The docker run command on a TestCafe image dissected

Let me explain the command: docker run --rm -v c:/sitk/eShopOnWeb/tests/e2eTests:/tests -it testcafe/testcafe chromium /tests/*_test.js.

The command docker run tells Docker to run the image you have specified in its separate container.

The next parameter, --rm, is optional and tells Docker to delete the container after it finishes. By default, Docker keeps the container’s file system after the container’s commands end, which helps with debugging or retrieving data from the container after a run. But for short-lived processes like end-to-end tests, it’s best to delete containers to avoid cluttering your disk space.

The -v parameter tells Docker to mount a directory from your computer to a directory inside the container; this is how the container accesses your local test files. In this case, your TestCafe tests live in c:\sitk\eShopOnWeb\tests\e2eTests in your local environment, and the directory in the Linux container is /tests. Don’t miss the colon (:) separating the local directory from the container directory.

Next, the -it parameter tells Docker to run the specified image in interactive mode. In short, it sets up an interactive shell that pipes what the container is doing into your terminal window. If you run this command without -it, it will still execute, but its output will not be displayed correctly in your terminal.

After configuring the parameters we need, we specify the image we want to use for our container: testcafe/testcafe, the official stable image we chose.

Finally, we get to the commands inside the container that we want to execute. For the browser parameter, as these are the pre-installed browsers in the Docker image, you can use either firefox or chromium. You can then use the test file parameter to specify which test files you want to run. In our example, we ran all files ending with *_test.js in the /tests mapped folder inside the container.
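Putting the browser and test-file parameters together, you could, for instance, run a single file in Firefox instead (home_test.js is an illustrative name):

```shell
# Same mount, different browser, one specific test file
docker run --rm -v c:/sitk/eShopOnWeb/tests/e2eTests:/tests -it testcafe/testcafe firefox /tests/home_test.js
```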

Notice that you will not see the browser open up and run your tests when running them inside the Docker image. Running graphical interfaces, such as a web browser, with Docker requires additional configuration so that the container can use your desktop’s display server. It’s not a straightforward setup, and it differs by operating system.

I installed TestCafe on my machine so you can see what we meant when we said it is a user-perspective test approach. See the video below and pay attention to the browser’s footer so you can follow TestCafe going through all 5 scenarios.

Congratulations, you have a powerful software development environment, and now you know how to test the software you are developing using cutting-edge technology and methodology. Next, we will wrap it up by adding Jenkins to orchestrate all tests in an automated manner.

