xUnit, TestCafe, Docker and Jenkins - Sharing is the key

Software Testing Quality Assurance using xUnit, TestCafe along with Docker and Jenkins: an automated approach 2/3

This is the second of three posts about QA Testing using xUnit, TestCafe and Docker, now adding Jenkins to give it an automated approach. For those who missed the previous post, it’s worth reading, since most of the steps we take here rely on prerequisites we covered there.

What is, what is used for and why Jenkins?

Jenkins logo

Jenkins is an open-source automation server written in Java, with a large plugin ecosystem designed for Continuous Integration and Continuous Delivery (CI/CD). Jenkins is used to continuously build and test software projects, making it easier for developers to integrate changes and for users to obtain a fresh build. It also lets you continuously deliver your applications by integrating with a wide range of testing and deployment technologies.

Continuous Integration / Continuous Delivery

Continuous Integration and Continuous Delivery are software development practices in which developers commit changes to the source code in a shared repository several times a day or more often. Each commit is then built and tested, which helps teams identify issues early. Depending on the CI/CD tool, many other tasks can be automated as well, such as deploying the built application to a test server and delivering the build and test results to the affected teams.

Beyond the technical definition, Jenkins is the distinguished gentleman in the image who came to serve you and make your life as a developer better. Let’s not take him for granted and make the most of Jenkins.

jenkins plugins

We all know that CI/CD is one of the most important parts of DevOps, used to integrate its various stages. For those who have no clue what DevOps means, it is a set of practices that integrates software development and IT operations. It aims to shorten the system development life cycle and deliver high software quality through continuous integration and delivery.

With Jenkins, through automation, not only individual developers but also companies can accelerate software development. Jenkins covers all sorts of development life-cycle processes, including documenting, building, testing, packaging, staging, deployment, static analysis, and even more.

Jenkins’ benefits include:

  • It is an open-source application with great community support.
  • It is straightforward to install using the official Jenkins Docker image.
  • It has 1000+ plugins to simplify your job. If a plugin doesn’t exist, you can code it and share it with the community.
  • It is cost-free.
  • It is Java-built and is, therefore, portable to all major platforms.

CI/CD with Jenkins depicted

Imagine a situation where the application’s full source code is built and only then deployed to the test server for testing. It may sound like a reasonable way to build software, but this process has several pitfalls. Let me clarify them:

  • Developers have to wait until the complete software is developed to get test results.
  • There is a strong probability that several bugs will be found in the test results, and it is difficult for developers to locate them since they have to review the entire source code.
  • It delays the delivery process of applications.
  • There is no continuous feedback on coding or architectural problems, build failures, test status and release uploads, so software quality declines.
  • The whole process is manual, which raises the likelihood of frequent failures.

From the issues mentioned above, it is evident that the software delivery process is sluggish and software quality is likely to deteriorate over time, which leads to disappointed customers. There is a pressing need for a system in which every change made to the source code constantly triggers a build and test. That is what CI/CD is all about. Jenkins is one of the most mature CI/CD platforms available, so let us see how it overcomes these deficiencies in the simplified, self-explanatory workflow diagram below.

simplified CI diagram with Jenkins

As you can see, Jenkins uses the pipeline concept to chain multiple steps, according to the parameters you set, and perform both basic and complex tasks. Once defined, pipelines build the code and coordinate the work needed to push applications from commit to delivery.

Let’s make things happen now!

A useful hint I am going to give you is to use docker-compose as much as you can. Docker Compose is a tool for defining and running multi-container Docker applications. With docker-compose, you use a YAML file to configure your application’s services; then you build and start all the services from that configuration with a single command. On top of that, the YAML file is easy to read, and you can treat it as source code and track all changes with your preferred SCM tool. Docker Compose is installed as part of Docker Desktop. If you haven’t installed Docker Desktop yet, go to our first post of a series of four and follow the instructions.

docker-compose

Since I am using Windows, open PowerShell or Windows Terminal and create the jenkins directory by running the New-Item -Type Directory -Path jenkins command, as shown below.
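Because the compose file we will write shortly bind-mounts c:\sitk\jenkins\jenkins_home, it is worth creating that subfolder now as well. Here is a minimal PowerShell sketch, assuming c:\sitk is your working root:

# Create the working folders (the bind mount in docker-compose.yaml expects jenkins_home to exist).
New-Item -Type Directory -Path c:\sitk\jenkins
New-Item -Type Directory -Path c:\sitk\jenkins\jenkins_home
Set-Location c:\sitk\jenkins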

Jenkins folder

From inside the jenkins directory, type code . and hit Enter to open an instance of VSCode.

open vscode from inside jenkins folder

Create a file called docker-compose.yaml inside the jenkins folder.

docker-compose yaml file

Copy the code below into the docker-compose.yaml file and save it.

version: '3.8'
services:
  cicd: 
    image: sitk/jenkins:latest
    container_name: sitk_jenkins
    ports:
      - 8080:8080
      - 50000:50000
    volumes:
      - jenkins-home-volume:/var/jenkins_home
volumes:
    jenkins-home-volume:
      driver: local
      driver_opts:
        type: none
        device: c:\sitk\jenkins\jenkins_home
        o: bind

It should look like this.

docker-compose yaml file filled up

YAML is a human-readable data serialization format that can be used with any programming language and is commonly used to write configuration files. It is imperative to keep this file valid, which means following some rules on structure and indentation. Therefore, I recommend the popular VSCode extension below, with more than 4.6 million installs, to help us validate the file.

YAML Red Hat VSCode extension

We will use the official Jenkins Docker image jenkins/jenkins:lts and extend it by installing the .NET Core SDK, since we will restore, clean, build and unit test the eShopOnWeb solution.

Docker Jenkins

Building a new Jenkins Docker image

Let’s build the sitk/jenkins:latest image. Create an image folder inside the jenkins folder, and create a file named Dockerfile inside the image folder with the content shown below.

FROM jenkins/jenkins:lts
# Switch to the root user to install the .NET Core SDK
USER root

# Show the distro information during compilation!
RUN uname -a && cat /etc/*release

# Based on instructions at https://docs.microsoft.com/en-us/dotnet/core/install/linux-debian#debian-9-
# Installing with APT can be done with a few commands. Before you install .NET SDK, run the following commands 
# to add the Microsoft package signing key to your list of trusted keys and add the package repository.
RUN wget -O - https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg && \
    mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/ && \
    wget https://packages.microsoft.com/config/debian/9/prod.list && \
    mv prod.list /etc/apt/sources.list.d/microsoft-prod.list && \
    chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg && \
    chown root:root /etc/apt/sources.list.d/microsoft-prod.list


# Install the .NET Core SDK, set the path, and show the installed version.
RUN apt-get update && \
    apt-get install -y apt-transport-https && \
    apt-get update && \
    apt-get install -y dotnet-sdk-3.1 && \
    export PATH=$PATH:$HOME/dotnet && \
    dotnet --version

# Switching back to the jenkins user.
USER jenkins

In the end, it should look like this.

sitk Jenkins Dockerfile

Now we will build sitk/jenkins:latest with the .NET Core SDK included. Create a build.sh file inside the image folder containing the command docker build -t sitk/jenkins . (note the trailing dot). It should look like this.

sitk Jenkins image build shell script
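For reference, the whole build.sh file is just the build command plus an optional shebang; this is a minimal sketch:

#!/bin/bash
# Build the extended Jenkins image (tagged sitk/jenkins:latest) from the Dockerfile in this folder.
docker build -t sitk/jenkins .

You may need to make the script executable first with chmod +x build.sh.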

Go to Windows Terminal and open a Linux shell. In my case, I have Ubuntu 20.04 running on Windows Subsystem for Linux version 2. Go to the /mnt/c/sitk/jenkins/image directory and run ./build.sh. If you missed that setup and want to catch up, go to our first post of the series of four.

After compiling the image, you should see something like this.

sitk jenkins image compilation

Now we are ready to run docker-compose. Go back to the Windows Terminal and type docker-compose up -d and hit Enter. I recommend saving the command in a start.ps1 file, so you don’t need to remember the syntax next time.
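The start.ps1 script is a sketch containing a single line:

# start.ps1 - start (or create) the Jenkins container in detached mode.
docker-compose up -d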

cicd service up and running

This command will start a container in detached (-d) mode with the cicd service, which is based on the sitk/jenkins:latest image,…

docker-compose image

…it will name the container sitk_jenkins,…

docker-compose container name

…it will map the container’s internal ports 8080 and 50000, which sit on the right side of the colon, to the same ports on the host, which sit on the left side of the colon. Just as a curiosity, port 50000 is used in more robust environments when you need to attach Jenkins agents to run pipelines in parallel. Mapping port 8080 to the host is important so we can access the service via http://localhost:8080,…

docker-compose ports

…and the volumes declaration maps /var/jenkins_home, the directory inside the container where all Jenkins data resides, to c:\sitk\jenkins\jenkins_home on the host. This mapping is essential for managing the data, attaching it to another container for upgrade purposes, and backing up Jenkins data. Most importantly, the data will also survive container stops, restarts and deletions, which means no Jenkins data loss.

docker-compose volumes
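Once the container is up, you can double-check the mapping from the host. This is a hedged sketch: Compose usually prefixes the volume name with the project (folder) name, so jenkins_jenkins-home-volume is an assumption; run docker volume ls first if the name differs:

docker volume ls
docker volume inspect jenkins_jenkins-home-volume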

Go back to Windows Terminal, type docker ps -f "name=sitk_" and hit Enter. If everything went right, you will see the Jenkins container running.

show Jenkins container running

We are almost there. Now go to http://localhost:8080 and you will see something like this.

Unlock Jenkins

Follow the instructions on the screen. Remember when we mapped the container’s /var/jenkins_home directory to the host directory c:\sitk\jenkins\jenkins_home in the YAML file? That’s another useful application of Docker bind volumes: you don’t need to go inside the container to access its files. Open the c:\sitk\jenkins\jenkins_home\secrets\initialAdminPassword file, copy the password, paste it into the administrator password field and click Continue.
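A quick way to grab the password without opening an editor is straight from PowerShell:

# Read the initial admin password from the bind-mounted Jenkins home and copy it to the clipboard.
Get-Content c:\sitk\jenkins\jenkins_home\secrets\initialAdminPassword | Set-Clipboard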

Click on the Install suggested plugins button.

Jenkins Install suggested plugins

And wait a couple of minutes for Jenkins to download and install all plugins.

Installing Jenkins plugins

After the plugins are installed, you will be prompted to create the First Admin User.

Jenkins first admin user

Type the information required and hit Save and Continue.

fulfill all required info for the Jenkins first admin user

Leave the Jenkins URL unchanged (http://localhost:8080/) and hit Save and Continue.

Jenkins instance configuration

Now we are done with Jenkins’s initial setup.

Jenkins is ready

Click on the Start using Jenkins button, and there you go!

Jenkins login

Before we start creating pipelines for the suggested Jenkins workflow graph, let me share an important hint with you. Create a PowerShell script that stops and removes the Jenkins container with the docker-compose rm --stop --force command, name it stop.ps1, and save it in the same directory as the start.ps1 script you created previously. Yes, you heard it right: the command will stop and remove the container. Don’t be afraid of doing so, because everything Jenkins needs the next time we start it lives in the c:\sitk\jenkins\jenkins_home directory, remember?
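The stop.ps1 script is just as small; a minimal sketch:

# stop.ps1 - stop and remove the Jenkins container; all data survives in c:\sitk\jenkins\jenkins_home.
docker-compose rm --stop --force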

So, let’s do this. Go to Windows Terminal, under the c:\sitk\jenkins directory, and type .\stop.ps1 to stop the Jenkins container.

stop jenkins container

Let’s start it again with the start.ps1 PowerShell script you created previously.

starting jenkins container

Let’s make sure that Jenkins is up and running because we will need it soon. Visit http://localhost:8080 and enter the admin user credentials you previously created and sign in.

Jenkins login

Since we will check the solution out from GitHub and then restore, clean, build and test it, we will need the .NET SDK Support plugin for Jenkins. Go to the Manage Jenkins option in the left panel and click on Manage Plugins.

Manage Jenkins manage plugins

Click on the Available tab, type “dotnet” in the search field, select .NET SDK Support, and click on the Install Without Restart button.

install NET SDK Support plugin

Please wait until it is completely installed.

NET SDK Support installed

Now we are ready to create the pipeline with some of the stages described in the Jenkins workflow graph. Let’s use eShopOnWeb, an ASP.NET Core application we used in posts three and four of our series of four. It is worth looking at them, but you can always visit our eShopOnWeb GitHub repository for more details.

On Jenkins’s home page, select New Item.

Jenkins New Item

Enter eShopOnWeb as the item name, select Pipeline and click OK.

enter item name, select pipeline and click ok

We suggest you give it a brief description and click on the Pipeline tab.

basic cicd pipeline for eShopOnWeb solution

Select the Pipeline script from SCM option from the Definition field.

select pipeline script from SCM

Select the Git option from the SCM field.

select git from SCM field

Copy eShopOnWeb git URL https://github.com/sitknewnormal/eShopOnWeb.git to the Repository URL field, and make sure the Script Path field is “Jenkinsfile,” and click on the Save button.

Copy eShopOnWeb git URL to the Repository URL field

Now we have to prepare the Jenkinsfile, because it will contain all the stages Jenkins will run in the pipeline. Go to the eShopOnWeb root folder, create the Jenkinsfile and copy in the content below.

pipeline {
    agent any

    stages {
        stage('Checkout') {
            steps {
                git url: 'https://github.com/sitknewnormal/eShopOnWeb.git', branch: 'master'
            }
        }
        stage('Restore') {
            steps {
                sh "dotnet restore eShopOnWeb.sln"
            }
        }
        stage('Clean') {
            steps {
                sh "dotnet clean eShopOnWeb.sln"
            }
        }
        stage('Build') {
            steps {
                sh "dotnet build --configuration Release eShopOnWeb.sln"
            }
        }
        stage('Functional, integration and unit tests (xUnit)') {
            steps {
                sh "dotnet test eShopOnWeb.sln"
            }
        }
    }
}

It should look like this.

Jenkinsfile eShopOnWeb

Don’t forget to change the eShopOnWeb git URL if you forked or cloned the project. In the URL “https://github.com/sitknewnormal/eShopOnWeb.git” you most probably only have to change the sitknewnormal part, highlighted in red, to your GitHub username. Most importantly, don’t forget to commit and push the Jenkinsfile to your repository, because Jenkins will fetch it from there when the pipeline runs.

To do so, I am using the popular GitLens VSCode extension, which I recommend.

GitLens VSCode extension

Select the GitLens extension and stage the Jenkinsfile by clicking the plus (+) button.

Select GitLens extension and stage Jenkins file by clicking plus button

Type a commit description “first Jenkinsfile commit” and commit the changes by clicking on the checkmark button.

commit Jenkinsfile changes

Push all changes to the eShopOnWeb GitHub repository. Hit Ctrl+Shift+P to open the Command Palette, type “Push”, choose the “Git: Push to…” option and select the eShopOnWeb remote repository from the list. If you don’t have one, click on “Add a new remote…” and follow the instructions.

git push to eShopOnWeb GitHub repository
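If you prefer the command line over GitLens, the same commit and push can be done with plain Git from the eShopOnWeb root. The remote name origin is an assumption; use whatever your remote is called:

git add Jenkinsfile
git commit -m "first Jenkinsfile commit"
git push origin master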

Now that we have everything set up, let’s run the Jenkins pipeline eShopOnWeb we’ve just created.

Go back to Jenkins in the pipeline eShopOnWeb and click on the Build Now button.

click build now button

Wait a couple of minutes for the pipeline to execute, and you will see something like this.

pipeline eShopOnWeb result

As you can see, all stages in this pipeline have succeeded! The Jenkins project source code is on our GitHub account.

Congratulations, we’ve reached a relevant part of our goal. We now have one of the most powerful and popular CI/CD tools called Jenkins, taking care of our Software Development life cycle’s important steps.

We saved the end-to-end tests with TestCafe, and more, for the next post because we don’t want to make this one too long. So let’s make it a series of three instead of the two we planned before.

We also have pro bono projects just in case you are interested in learning more about them.

Learn why we started this project by clicking here.

Learn more about other posts here.

Contact us with any suggestions. And follow us on Facebook, Instagram and Twitter.

If you are a good reader like myself, I recommend the following readings:

  • Docker Quick Start Guide: Learn Docker like a boss, and finally own your applications
  • Docker for Developers: Develop and run your application with Docker containers using DevOps tools for continuous delivery
  • C# and .NET Core Test-Driven Development: Dive into TDD to create flexible, maintainable, and production-ready .NET Core applications, Kindle Edition
  • Modern Web Testing with TestCafe: Get to grips with end-to-end web testing with TestCafe and JavaScript 1st Edition, Kindle Edition
  • Continuous Delivery with Docker and Jenkins: Create secure applications by building complete CI/CD pipelines, 2nd Edition, Kindle Edition

See you in the next post!

xUnit, TestCafe, Docker and Jenkins - Sharing is the key

Software Testing Quality Assurance using xUnit, TestCafe along with Docker and Jenkins: an automated approach 1/3

One of the most important steps in the Software Development life cycle is Software Testing Quality Assurance. There are many ways to deliver quality software in all phases of the development process. It’s undeniable that the earlier you start introducing QA methods and techniques into your development process, the more cost-effective it will be. Furthermore, you will most likely finish it on time.

What is Software Testing Quality Assurance?

Software Testing Quality Assurance is a process to ensure the quality of software products or services offered by an organization to its customers. Quality assurance focuses on improving software production and making it effective and efficient in compliance with software products’ quality standards. Quality Assurance is commonly referred to as QA Testing.

It is usually based on Plan-Do-Check-Act steps, which are evaluated and improved periodically by the organization, most likely supported by tools.

To ensure that the product is designed and delivered with proper procedures, a company must use Quality Assurance. This helps reduce problems and errors in the final product.

Quality Assurance best practices:

  • Create a robust testing environment
  • Select the release criteria carefully.
  • To save money, apply automated testing to high-risk areas. This also helps speed up the whole process.
  • Allocate appropriate time for each phase
  • Prioritize bug fixes based on how the application is used.
  • Form a dedicated team for security and quality testing
  • Simulate tests in a production environment

There is much more to say about Software Testing Quality Assurance, but it is not the focus of this post.

Software Testing – theory into practice

Let’s use eShopOnWeb, a sample ASP.NET Core 3.1 reference application running in a Docker container and persisting data in a SQL Server database, for our Software Testing experiment. You can find more detailed info about preparing this environment here. Reading it is strongly recommended, since there are some prerequisite steps we are not going to cover here.

Hitting the right test granularity with xUnit

xUnit is the collective name for several unit testing frameworks that derive their structure and functionality from Smalltalk’s SUnit. SUnit, designed by Kent Beck in 1998, was written in a highly structured object-oriented style that translated easily to contemporary languages such as Java and C#.

xUnit testing

Some of the must-have best practices while developing software are building functional, integration and unit tests. In the image below, you can see that the eShopOnWeb BasketAddItem unit test uses xUnit, a free, open-source, community-focused unit testing tool for .NET.

eShopOnWeb BasketAddItem unit test

To run all the eShopOnWeb solution tests, go to the VSCode Terminal, which opens a bash terminal since the solution is running in a Linux container. Make sure you are in the solution root directory, /workspaces/eShopOnWeb, then type dotnet test eShopOnWeb.sln and hit Enter.

running all eShopOnWeb solution tests
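If you only want to run a subset, dotnet test also accepts a single project. The path below is an assumption based on the usual eShopOnWeb layout; adjust it to your checkout:

dotnet test tests/UnitTests/UnitTests.csproj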

Developing inside a container gives you lots of advantages and huge flexibility if you happen to use different languages and/or dependencies across your projects, not to mention that your workstation or laptop will not end up cluttered with all kinds of installed tooling. To learn more about this, visit our first post of the series of four.

Using the TDD (Test-Driven Development) approach in the development process is one of the most effective ways to develop functional, integration and unit tests. This is just a friendly recommendation!

What is Test-Driven Development?

In software development, TDD is a design process based on the repetition of a very short development cycle in which requirements are converted into specific test cases.

These are TDD steps:

  1. Write a unit test that fails (see the sketch below).
  2. Write just enough code to make the test pass; at this step, we don’t care about good code.
  3. Refactor the code from the previous step.
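To make the cycle concrete, here is a minimal xUnit sketch. The Basket class below is a stand-in invented purely for illustration, not the real eShopOnWeb entity: in step 1 you would write the [Fact] before AddItem exists and watch it fail, then add just enough code to turn it green, and finally refactor.

using System.Collections.Generic;
using Xunit;

// Hypothetical, minimal class used only to illustrate the red-green-refactor cycle.
public class Basket
{
    private readonly List<int> _items = new List<int>();

    public IReadOnlyCollection<int> Items => _items;

    public void AddItem(int catalogItemId) => _items.Add(catalogItemId);
}

public class BasketTests
{
    [Fact]
    public void AddItem_AddsExactlyOneItem()
    {
        var basket = new Basket();

        basket.AddItem(catalogItemId: 1);

        Assert.Single(basket.Items);
    }
}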

We will not cover this technique here either, since the focus is to give an in-depth approach to Software Testing using xUnit, TestCafe along with Docker and Jenkins.

TestCafe: testing software from a user perspective

Even though building functional, integration and unit tests is important for testing the back-end and business tiers, it is not enough when we are talking about web application quality assurance. We will most likely need to test the application end to end, covering the frontend and its behaviour in different browsers, devices and screen resolutions, simulating a test from the user’s perspective.

TestCafe logo

To run tests from a user perspective, there are many options, but let’s choose TestCafe, an open-source project and a pure NodeJS end-to-end solution for web app testing. It takes care of all phases: starting browsers, running tests, collecting test data and producing reports. TestCafe doesn’t need browser plugins; it works out of the box on all popular modern browsers.

The installation process for TestCafe is easy, but you might have problems getting everything up and running locally. TestCafe depends on NodeJS, and you will also need at least one compatible browser set up on your machine. In certain situations, such as having restricted access to your work machine’s user account, you might not be able to perform these installs easily.
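For reference, a purely local setup (assuming NodeJS and a browser such as Chrome are already installed) takes only two commands; the test path matches the layout used later in this post:

npm install -g testcafe
testcafe chrome tests/e2eTests/*_test.js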

There is also the scenario of running your end-to-end tests in other environments, such as continuous integration tools like Jenkins. Chances are these systems will not have all the dependencies you need to run your tests, and you will also have to keep each dependency up to date on them.

docker

One of the key advantages of using Docker is that you can use the same image on several operating systems and not worry about things working differently on each tester’s device. It ensures a consistent environment with everything you need, provided you use the same image version.

OK, enough theory. Let’s put it into practice. Assuming you already have Docker installed on your machine and have git cloned eShopOnWeb, steps we explained in detail in the first, second and third posts of our series of four, we are now prepared to test some scenarios using the official, stable TestCafe Docker image.

First, let me explain the scenarios I have prepared. We are going to test 5 scenarios; TestCafe calls each scenario a fixture:

  • eShopOnWeb Home page – verifies that eShopOnWeb’s home page loads properly. Don’t forget to change the IP address in the URL (https://192.168.2.217:5001/) shown below to your machine’s (i.e. the Docker containers’ host)
eShopOnWeb home page load test scenario
  • eShopOnWeb Login – checks if the user can log in to their account
eShopOnWeb login test scenario
  • eShopOnWeb Use Case 01 – checks if the brand filter visual component exists
eShopOnWeb use case 01 test scenario
  • eShopOnWeb Use Case 02 – checks if the brand filter is clickable and there is a “.NET” option
eShopOnWeb use case 02 test scenario
  • eShopOnWeb Use Case 03 – checks if the filter returned the right number of items
eShopOnWeb use case 03 test scenario
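To give you an idea of what one of these fixtures looks like in code, here is a minimal sketch of the home-page check. The URL must point at your own host, and the .esh-catalog-items selector is an assumption about the page markup, so adjust both to your environment:

import { Selector } from 'testcafe';

// Minimal sketch of the "eShopOnWeb Home page" fixture; URL and selector are assumptions.
fixture('eShopOnWeb Home page')
    .page('https://192.168.2.217:5001/');

test('Home page loads and renders the catalog', async t => {
    const catalog = Selector('.esh-catalog-items');

    await t.expect(catalog.exists).ok();
});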

TestCafe is powerful and very straightforward to learn, so take the chance to get to know it!

Before you go to the next step, make sure you have MS SQL Server 2019 running. We explained how to do it here.

First, let’s run eShopOnWeb without debugging by selecting the Run menu and then Run Without Debugging, as shown below, and wait until it automatically opens the browser at https://localhost:5001.

run eShopOnWeb without debugging

To run the 5 tests, open PowerShell or Windows Terminal and run this command: docker run --rm -v c:/sitk/eShopOnWeb/tests/e2eTests:/tests -it testcafe/testcafe chromium /tests/*_test.js. If you haven’t pulled the TestCafe image before, it will show something like the image below.

running 5 scenarios using the TestCafe official image

As you can see, all tests have passed, which is great!!

The docker run command on the TestCafe image, dissected

Let me explain the command: docker run --rm -v c:/sitk/eShopOnWeb/tests/e2eTests:/tests -it testcafe/testcafe chromium /tests/*_test.js.

The docker run command tells Docker to run the image you specify in its own separate container.

The next parameter, --rm, is optional and tells Docker to delete the container after it has finished. By default, Docker keeps the container’s file system when the container’s command ends, which helps when you want to debug or retrieve data from the container after it has run. But for short-lived processes like running end-to-end tests, it’s best to delete the containers to avoid cluttering your disk space.

The -v parameter tells Docker to mount a directory from your computer to a directory inside the container. This is how the container accesses your local test files. In this case, the location of your TestCafe tests in your local environment is c:\sitk\eShopOnWeb\tests\e2eTests, and the directory in the Linux container is /tests. Don’t miss the colon (:) separating the local directory from the container directory.

Next, the parameter -it tells Docker to run the specified image in interactive mode. In short, it sets up an interactive shell to pipe our terminal window into what the container is doing. If you run this command without the -it parameter, the command will still be executed, but its execution will not be correctly displayed on your terminal.

After configuring the parameters we need, we specify the image we want to use for our container. It’s testcafe/testcafe, the official stable image we chose to run.

Finally, we get to the command we want to execute inside the container. For the browser parameter you can use either firefox or chromium, as these are the browsers pre-installed in the Docker image. You then use the test file parameter to specify which test files to run. In our example, we ran all files ending with _test.js in the /tests folder mapped inside the container.
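As a small variation on the same command, TestCafe also accepts a comma-separated browser list, so you can run the suite in both pre-installed browsers in one go; this is a sketch using the same paths as above:

docker run --rm -v c:/sitk/eShopOnWeb/tests/e2eTests:/tests -it testcafe/testcafe "chromium,firefox" /tests/*_test.js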

Notice that you will not see the browser open and run your tests when running them through the Docker image. Additional configuration is needed to run graphical applications, such as a web browser, from Docker on your desktop so that Docker can use your desktop’s display server. It’s not a straightforward setup, and it differs from system to system.

I installed TestCafe on my machine to let you see what we meant when we said it is a user-perspective test approach. See the video below and pay attention to the browser’s footer so you can follow TestCafe going through all 5 scenarios.

Congratulations, you have a powerful software development environment, and now you know how to test the software you are developing using cutting-edge technology and methodology. Next, we will wrap it up by adding Jenkins to orchestrate all tests in an automated manner.

We also have pro bono projects just in case you are interested in learning more about them.

Learn why we started this project by clicking here.

Learn more about other posts here.

Contact us with any suggestions. And follow us on Facebook, Instagram and Twitter.

If you are a good reader like myself, I recommend the following readings:

  • Docker Quick Start Guide: Learn Docker like a boss, and finally own your applications
  • Docker for Developers: Develop and run your application with Docker containers using DevOps tools for continuous delivery
  • C# and .NET Core Test-Driven Development: Dive into TDD to create flexible, maintainable, and production-ready .NET Core applications, Kindle Edition
  • Modern Web Testing with TestCafe: Get to grips with end-to-end web testing with TestCafe and JavaScript 1st Edition, Kindle Edition

See you in the next post!
