MPF Unittest automation with Bitbucket Pipelines

The Mission Pinball Framework (http://missionpinball.org) supports unit tests (http://developer.missionpinball.org/en/latest/testing/index.html), which is super awesome. We all should be writing and running unit tests for the code we write, even if that's just writing YAML files for MPF.

First off, running the unit tests in your local development environment should be as simple as a single command:

python3 -m unittest
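If you haven't written a test yet, here is a minimal sketch of what that command picks up; the file name, class name and assertion are just placeholder examples, not real MPF tests (real machine tests subclass MPF's test case classes, as the testing docs linked above describe):

```shell
# Minimal sketch: the unittest runner discovers any file matching
# test*.py in the current folder. These names are examples only.
cat > test_smoke.py <<'EOF'
import unittest

class SmokeTest(unittest.TestCase):
    def test_scoring_maths(self):
        self.assertEqual(10 * 5, 50)
EOF

# Run the suite verbosely; a passing run ends with "OK".
python3 -m unittest -v
```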

I am not steering you away from running the tests locally. In fact, in the process of explaining how to use Bitbucket Pipelines (https://bitbucket.org/product/features/pipelines) I will give you a quick script you can use to ensure the code you're about to commit "should" pass in Pipelines.

So if we can run the tests locally, why use Pipelines?

Well, humans are forgetful. Over the long period of time I will spend building a pinball machine, it's likely I am going to forget to run the tests, so I want a way to ensure they get run. And what is easier than having them run for you when you commit to your master branch?

Why only on the master branch?

Well, I only have a free-plan Bitbucket account, and you get 50 minutes of build time a month for free. That should be plenty of builds, especially as my machine is in its early stages. If I find as I progress that I am nowhere near my 50 minute limit, I might consider running tests on the development branch too. At least with my master branch being tested automatically, I know that when I merge into master the tests will fire and I will get feedback on how my machine code is doing.

Docker Image

So Bitbucket Pipelines uses a Docker image to run your pipeline. Now, I could have taken an off-the-shelf image of, say, Ubuntu and just written a heap of commands to run every time I perform a pipeline. This would mean installing all the extra prerequisite packages and getting MPF installed inside the Docker image on every pipeline build, but all of this takes time, and remember we only have 50 free minutes. So to get started I first created a new Docker image with all the needed stuff baked in.

Of course I wrote up a blog earlier on how to run MPF on Ubuntu, and I could have just used those steps to create an Ubuntu image with MPF. However, the other thing that takes build time is downloading the image, and when I installed MPF inside the Ubuntu image it was getting upwards of 1.3GB in size. I knew I could do better. The base Ubuntu image is over 600MB; if I change to Alpine Linux this comes down to 3.99MB! Great, the underlying OS is a heap smaller, so I could just stop there and be done with it.

BUT how you write your Dockerfile can drastically affect your Docker image size. First up, if we put each command needed to set up MPF on its own RUN line we end up with many, many layers (one per command), so we compact all the setup commands into one big RUN line. Next, we have to install some dev packages in order to build the required packages; after these are built we uninstall the dev packages and reduce the size of our image.
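The approach above can be sketched as a Dockerfile like this. This is an illustrative sketch only, not the exact contents of my Dockerfile (the package list is an assumption; see my repo linked below for the real thing):

```dockerfile
# Sketch only -- package names are illustrative, not my exact Dockerfile.
FROM alpine:latest

# One big RUN line keeps everything in a single layer:
# install runtime packages, pull in build-only dev packages under a
# virtual name, install MPF, then delete the build deps to shrink the image.
RUN apk add --no-cache python3 py3-pip \
    && apk add --no-cache --virtual .build-deps gcc musl-dev python3-dev \
    && pip3 install mpf \
    && apk del .build-deps
```

The `--virtual .build-deps` trick lets Alpine's package manager remove the whole group of build-time packages with a single `apk del` at the end of the same layer, so they never contribute to the final image size.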

Once again, I do not want to build images and push them up to Docker Hub myself, so I made the repo containing my Dockerfile public (https://bitbucket.org/mikef_aus/mpf-docker/src) and configured Docker Hub to build a new latest image (https://hub.docker.com/r/mfuller/mpf-docker/) whenever I push to master on my Dockerfile repo.

Once I pushed my new Dockerfile to the repo, Docker Hub kicked into action and created my new image.

Now anyone can use my Docker image for unit tests, and it could be used for other things too, like possibly running MPF itself inside a Docker container (untested as yet).

Using the Docker Image locally

One of the great things about Docker is portable, consistent and isolated environments. This means that if you use a Docker image for your build tests, they should not be affected by your local machine configuration, and when they pass locally they should pass fine in your CI/CD pipelines.

Once you have Docker installed on your local machine (https://www.docker.com), you can pull down my Alpine Linux image with MPF like this:

docker pull mfuller/mpf-docker

Now you have the image downloaded, ready to run whenever you need it. To make running your MPF machine unit tests simple, I added a script called test.sh to the base of my machine folder with the following contents:

#!/bin/sh
# If we are already inside the container, run the tests directly;
# otherwise start the container with the machine folder mounted at /mpf
# and re-run this script inside it.
if [ -f /proc/1/cgroup ] && grep -q docker /proc/1/cgroup; then
  python3 -m unittest
else
  docker run -v "$(pwd)":/mpf -it mfuller/mpf-docker /mpf/test.sh
fi

Once you create this script and make it executable (chmod +x test.sh), you can just run ./test.sh from the base of your MPF machine folder; your Docker image will be spun up and the unit tests run.

I plan on doing some more testing with this image later on, actually running MPF inside the container. In theory it should work: adding the FTDI driver to the image and mapping the USB device should allow it to run your real pinball machine. More on this later.

How to setup automated unittests

So we have my Docker image; now I can configure Bitbucket Pipelines. This is pretty simple if your codebase is in Bitbucket already; if not, you would need to migrate your repo to Bitbucket. Also note that this repo can be private, so you're not giving the world all your code. Once you have your repo in Bitbucket, add a new file called bitbucket-pipelines.yml to the base of your repo with the following code:

image: mfuller/mpf-docker:latest

pipelines:
  branches:
    master:
      - step:
          script:
            - python3 -m unittest

Then commit the file and push to master. You should see Bitbucket Pipelines spin up a build job, download the MPF Docker image and then start your unit tests. If all goes well you get a green tick, and you have now ensured your code will get unit tested whenever you push to your repo.
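If you later find you are nowhere near the 50 minute limit and want your development branch tested as well (as I mentioned I might), the same file can carry another branch entry. A sketch, assuming your branch is actually called development:

```yaml
image: mfuller/mpf-docker:latest

pipelines:
  branches:
    master:
      - step:
          script:
            - python3 -m unittest
    development:
      - step:
          script:
            - python3 -m unittest
```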
