Regression Testing With docker-compose

Posted by Doug Toppin

Jul 5, 2017 · Node.js, Docker, Docker Containers, docker-compose, regression testing, Continuous Integration, CI/CD

Most systems are composed of multiple applications, and an orchestration mechanism is needed to start them when needed and in the correct order. A typical modern application might consist of a database such as MongoDB and a web service implemented in NodeJS. The web application may expose a number of API services via an HTTP REST interface. Testing those services requires test tools that can send requests to that interface and evaluate the results returned, such as HTTP status codes and response content.

For regression testing, particularly in a continuous integration (CI) environment, it is desirable to be able to start up the necessary resources, perform the testing, and then remove everything used for the test. Docker works well here because infrastructure services, such as a database, do not have to already be in place: every component is created as a container and removed when the test completes.

The docker ecosystem includes the docker-compose tool for orchestrating the configuration and startup of services as containers. docker-compose makes it easy to start container services that can communicate with each other, and a single docker-compose.yml file can define a number of services, including ones used to invoke regression test scripts.

The following graphic lists the steps involved in the process.

regression-docker.jpg 


1. A CI system invokes a shell script that runs the docker-compose commands to orchestrate the testing
2. The shell script runs docker-compose to start the testing process
3. docker-compose uses a docker-compose.yml file to determine how to run the services
4. The database is started first
5. The node script is then started
6. The node script connects to the database
7. The test script is then invoked
8. The test script begins sending requests to the server for testing


After the tests complete, docker-compose is used again to tear down the test facilities.

The docker and docker-compose versions used in this example follow. Other versions may also work correctly.

docker 17.06.0-ce-rc5

docker-compose 1.14.0

An example of a docker-compose.yml file that fits this model follows. This compose file describes db, web and test services and can be used to start and test the facility.

version: '2.1'
services:
    db:
        image: mongo:3
        container_name: db

    web:
        depends_on:
         - db
        container_name: web
        image: node:6.9.1
        ports:
         - "127.0.0.1:3000:3000"
        working_dir: /work
        volumes:
         - .:/work
        environment:
         - MONGO_IP=db
         - NODE_ENV=${NODE_ENV}
        command: node /work/server.js

    test:
        depends_on:
         - db
         - web
        image: node:6.9.1
        working_dir: /work
        volumes:
         - .:/work
        environment:
         - MONGO_IP=web
         - NODE_ENV=${NODE_ENV}
        command: curl -s http://web:3000

The above docker-compose file can be used with the following source and configuration files, which describe and implement a simple web service. Note the use of the depends_on keyword: it tells docker-compose to start the depended-on service before the services that depend on it. However, it does not ensure that the service is actually available and stable. It is possible to add health check functionality so that dependent services wait until they can actually connect, but that is beyond the scope of this post.
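For completeness, one way that readiness gating could look is a sketch like the following, since compose file format 2.1 supports a healthcheck block together with a condition on depends_on. The mongo ping command and the timing values here are assumptions, not part of the original example:

```yaml
    db:
        image: mongo:3
        container_name: db
        # Assumed check: the mongo shell pings its own server
        healthcheck:
            test: ["CMD", "mongo", "--eval", "db.adminCommand('ping')"]
            interval: 5s
            timeout: 3s
            retries: 5

    web:
        depends_on:
            # Wait for db to report healthy, not merely started
            db:
                condition: service_healthy
```

With this in place, docker-compose delays starting web until the db container's health check passes, rather than starting it as soon as the db container exists.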


package.json

{
    "name": "api",
    "version": "1.0.0",
    "main": "server.js",
    "scripts": {
        "start": "node .",
        "test": "node test.js"
    },
    "dependencies": {
        "mongodb": "2.2.24",
        "express": "latest"
    },
    "devDependencies": {
        "newman": "latest"
    },
    "repository": {
        "type": "",
        "url": ""
    },
    "license": "",
    "description": "api"
}

The package.json file describes which NodeJS packages need to be installed.


server.js

var express = require('express');
var app = express();

var MongoClient = require('mongodb').MongoClient;
var assert = require('assert');

var url = 'mongodb://' + process.env.MONGO_IP + ':27017';
MongoClient.connect(url, function(err, db) {
    assert.equal(null, err);
    console.log("Connected correctly to server.");

    app.get('/', function (req, res) {
        res.send('Hello World!');
    });

    app.listen(3000, function () {
        console.log('Example app listening on port 3000!');
    });
});

The server.js file is the actual source code for the web service. In this example it only connects to the Mongo database and then listens for client connections on port 3000.


regrtest-compose.sh

#!/bin/sh

docker run -t --rm -w /work -v "$(pwd)":/work node:6.9.1 npm install

docker-compose -f docker-compose.yml up -d db web

docker-compose -f docker-compose.yml up test

docker-compose -f docker-compose.yml down

regrtest-compose.sh is a shell script that controls how and when the services defined in the docker-compose file are started. First, a NodeJS container is run to perform an npm install of the packages needed to run the server defined in server.js; the package.json file describes what is required.

After installing the required packages, the db and web services are started. Note that they are started with the -d argument, meaning they run detached from the shell that started them. Otherwise docker-compose would start the db service and block before continuing, because the db service does not exit.

The test service defined in the docker-compose.yml file is then started to perform the defined tests. For this simple example, curl simply confirms that the web service is reachable by connecting to its endpoint.
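If a bare connection check is too permissive, the test service's command could be tightened so that an HTTP error status also fails the run; curl's -f flag exits non-zero on 4xx/5xx responses. This stricter command is a suggested variation, not part of the original example:

```yaml
    test:
        # -s keeps the output quiet; -f makes curl exit non-zero on an
        # HTTP error status, so a 500 fails the test run just like an
        # unreachable host would.
        command: curl -sf http://web:3000
```

Because docker-compose up propagates the container's exit code, a CI system can key off the test service's status directly.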

After the tests are performed, the docker-compose down command ensures that all provisioned containers are stopped and removed.

If the above files are created in a directory and regrtest-compose.sh is run, you should see output as the packages are installed, the services are started, and the server.js script is tested.

The output should end with something like the following.

...
test_1 | Hello World!
answer_test_1 exited with code 0
Stopping web ... done
Stopping db ... done
Removing answer_test_1 ... done
Removing web ... done
Removing db ... done
Removing network answer_default

The key benefits of this approach are that nothing other than docker needs to be installed, and the tester does not need to know how to start the services or perform the tests.

This article covers how docker and docker-compose can be used to facilitate testing with a simple example.

Tools such as Postman (https://www.getpostman.com/), a UI-based tool, can be used to perform REST interface testing. Postman can send requests, check response data, chain a series of requests, and report the results. Test collections can be exported to JSON files that can then be used by Newman (https://www.getpostman.com/docs/newman_intro), which can be run from the shell.

Newman can be run in a container, significantly improving the ability to regression test using Docker. In the above example, Newman could be used in place of the curl command and can perform much more extensive testing.
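As a sketch of that substitution, the test service's command could run newman against an exported collection. Here collection.json is a hypothetical file name for an exported Postman collection, and the newman binary comes from the devDependencies installed by the npm install step:

```yaml
    test:
        depends_on:
         - db
         - web
        image: node:6.9.1
        working_dir: /work
        volumes:
         - .:/work
        # collection.json is a hypothetical exported Postman collection;
        # newman exits non-zero if any assertion in the collection fails.
        command: node_modules/.bin/newman run /work/collection.json
```

The rest of regrtest-compose.sh would be unchanged, since newman's exit code drives pass/fail just as curl's does.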

Keep an eye on the Vizuri Blog for more articles on how docker can be used in your development, test and production environments. 

Doug Toppin has been involved in software systems design and development in both the commercial and defense realms for more than 30 years. His experience includes small and large scale systems with high processing and availability requirements in a variety of applications.
