In this post we will take a look at Continuous Delivery using Tox and Bitbucket Pipelines. We will set up a basic Django project, use Tox to automate our testing and push our project to a Bitbucket repository. Then we will enable Bitbucket Pipelines to run our automated tests whenever we push new code to our remote repository.

Important: at the time of writing, Bitbucket Pipelines is still in beta. You can sign up for beta access, and in my experience Atlassian will quickly get you an account.

What is Continuous Delivery?

When we talk about Continuous Delivery it's very easy to start looking like a buzzword-spewing startup hipster. Don't get me wrong, I love it and think it's essential in any project.

In essence, Continuous Delivery is a collection of tools, methods and habits that help you create stable, bug-free and reliable software.

When used correctly, Continuous Delivery enables you to test your software at any time, on many platforms and on various software versions. It is continuous because the automated tests run every time someone pushes a new commit to the repository. A good solution is also automated: it doesn't require you or the other developers on the project to intervene unless things start catching fire.

Getting our demo project up and running

Let's get started. I'm going to use PyCharm 5 as my IDE but you can tag along with your IDE or editor of choice. I won't be doing anything that you can't do with Vim or Sublime Text.

I'm going to call our new project pipetox. I find it fun and interesting to give a demo a bit of background, so let's pretend we're building a management tool for maintaining pipes that guide toxic waste. Hence the name pipetox. Of course you can name your project whatever you'd like, just make sure you substitute the name intelligently and don't copy-paste like a script kiddie.

First, create a new virtualenv for our local development. I like to store it in my project root directory, but proceed as you'd like. Next, create an empty Django project. For simplicity we won't be using a cookiecutter template; just get a Django project running quick and dirty.
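If you want to follow along, something like this will do (the env directory name is just my habit; the Django version matches what we'll be testing against later):

$ python3 -m venv env
$ source env/bin/activate
$ pip install Django==1.9.7
$ django-admin startproject pipetox .
$ python manage.py runserver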

Now that we have this nice demo project up and running we don't want to mess things up. Let's make sure our project is off to a good start by getting things under test.

Our first test cases

Let's get a bit of Test Driven Development under way. When we visit the homepage of our demo project we want to display a nice greeting to our users. Let's write a test case for our homepage. But where to put our test case? We only have a blank Django project!

A personal preference of mine is to bundle all generic functionality of a project in a Django app called core. I usually put my generic views here, along with functionality that isn't tied to a particular piece of business or application logic.
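Creating the app takes a single command (just remember to add 'core' to INSTALLED_APPS in your settings.py afterwards):

$ python manage.py startapp core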

Now that we have an app to put our test case in, let's write our test (I put mine in core/tests.py):

from django.test import TestCase


class HomepageTestCase(TestCase):
    def test_homepage_welcome(self):
        """Visiting the application homepage should display a nice welcome message"""
        response = self.client.get('/')
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, "Welcome to PipeTox!")

Of course, running this test case fails gloriously because we haven't written our homepage yet. Go on and write a quick view, template and urlconf to get your test case to pass; a minimal version follows below.
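Something along these lines should do it; the file names and the class-based view are my own choices, not gospel:

# core/views.py
from django.views.generic import TemplateView


class HomepageView(TemplateView):
    # core/templates/core/homepage.html should contain "Welcome to PipeTox!"
    template_name = 'core/homepage.html'


# pipetox/urls.py
from django.conf.urls import url

from core.views import HomepageView

urlpatterns = [
    url(r'^$', HomepageView.as_view(), name='homepage'),
]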

Now we've reached a point where our application has some body to it. I created my version of our demo project on Python 3.5 and Django 1.9.7. Our test case runs in our local virtualenv, so we know this combination of Python and Django works. It would be great if we could test our project on different versions without too much trouble.

Automated testing using Tox

This is where Tox enters the spotlight. Tox is a tool to automate and standardize testing in Python. Simply put, you define a series of testing environments and tell Tox to go ahead and run your tests. Tox handles creating the virtualenvs, installing your dependencies and running your tests.

To get started, install Tox in your local virtualenv:

$ pip install tox

Next, put the Tox configuration in a tox.ini file in your project root. Here is a basic configuration file to get you started:

[tox]
envlist = py27,py35
skipsdist = true

[testenv]
deps = -r{toxinidir}/requirements.txt
commands = python manage.py test

Let's go over the configuration file. If you're not familiar with the INI file format, now is the time to use your Google kung-fu 😉

In the [tox] section we define a list of environments to run our tests in. Here we defined a Python 2.7 and a Python 3.5 environment. The line skipsdist = true tells Tox to skip the setup.py step since we don't have one in our project.

The next section, [testenv], is where we configure the testing environments themselves. Here we use the deps directive to point Tox at the requirements.txt file we want to use. You did use a requirements.txt file for your dependencies, right?
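For this demo project the file can be as minimal as a single pinned line (matching the Django version we developed against):

Django==1.9.7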

Next up is the commands directive that tells Tox what command to execute. Here we simply run the Django test runner on our project.

Running our automated tests

Now that our Tox configuration is complete, we can run our automated tests by simply executing the tox command in our virtualenv:

[Screenshot: running Tox to perform automated testing]

Here we can see that Tox created two virtualenvs, one for Python 2.7 and one for Python 3.5, and ran our tests in each. Tox then reported the successful execution of our test cases.

With just a simple config file we tested our project on two versions of Python, all in a matter of seconds.

I strongly recommend you read more about Tox in the official documentation to see what else is possible. One common use case is to test your code against different versions of a particular library. This makes it very easy to verify your Django app runs on Django 1.7, 1.8 and 1.9; a sketch of how that could look follows below.
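Using Tox's generative environments, a multi-Django matrix could look something like this (the version pins are illustrative, Django here comes from the factor-specific deps instead of requirements.txt, and keep in mind that not every Python/Django combination is supported):

[tox]
envlist = py{27,35}-django{18,19}
skipsdist = true

[testenv]
deps =
    django18: Django>=1.8,<1.9
    django19: Django>=1.9,<1.10
commands = python manage.py test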

Continuous Delivery using Bitbucket Pipelines

When I was talking about Continuous Delivery I specifically mentioned the automated part. Right now we still have to run Tox manually whenever we want to test our code base. Not bad, but remember that a good programmer is a lazy programmer.

Let's create a new Bitbucket repository to host our demo project and initialize a git repository in our project's root folder. Then we will push our master branch to our remote repository.

[Screenshot: creating a new Bitbucket repository]

git init
git add *
git commit -m "Initial commit"
git remote add origin git@bitbucket.org:sinaxgcv/pipetox.git
git push -u origin master

Note: make sure to add the .tox directory to your .gitignore, otherwise you'll commit all the Tox virtualenvs to your repository.
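A minimal .gitignore for this setup could look like this (env/ assumes you keep your virtualenv in the project root like I do):

.tox/
env/
*.pyc
__pycache__/
db.sqlite3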

Next we need to activate Pipelines on our Bitbucket repository. This is a pretty foolproof step where you just check the correct boxes in your repository settings. Here we selected the default Python configuration for Pipelines.

[Screenshot: enabling Pipelines in the repository settings]

You can create the required bitbucket-pipelines.yml config file yourself or create it through the Bitbucket UI. Here I changed the default config file so we just run the tox command in our repository root, which runs our tests using Tox like we did locally.
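My first version looked something like this (the exact default Python image Bitbucket picks may differ; the point is that it runs tox):

image: python:3.5.1

pipelines:
  default:
    - step:
        script:
          - pip install -U tox
          - tox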

[Screenshot: editing bitbucket-pipelines.yml in the Bitbucket UI]

After you commit the bitbucket-pipelines.yml file to your repository (manually or through the Bitbucket UI), Pipelines will start a Docker container to execute your commands.

[Screenshot: the Pipelines build status]

Some moments later you will see the notification that your Pipelines process... failed. What the...? Let's check out the details of the process by clicking on the commit message.

[Screenshot: the failed Pipelines build details]

Apparently, our testenv for Python 2.7 failed because the interpreter is unknown. After some minor head-scratching I realized why: Pipelines runs your test suite in a Docker container, and the default container selected by Bitbucket for Python doesn't have Python 2.7 installed.

It took me some time to find a suitable Docker image that had both Python 2 and Python 3 installed; building a custom one myself wasn't something I felt like doing at the time.

Change your bitbucket-pipelines.yml file to use the new Docker image like so:

image: parsys/python2-python3-pil-supervisord-redis

pipelines:
  default:
    - step:
        script:
          - pip install -U tox
          - tox

Commit your changes and push them to your repository. Bitbucket will pick up the new commit and kick off a new Pipelines process that will run your Continuous Delivery cycle:

[Screenshot: the successful Pipelines build]

And voilà, our Tox tests have been successfully run by Pipelines! Also interesting is the integration of Pipelines into the Bitbucket views of your project: it is very easy to see which commits broke the Continuous Delivery cycle and which ones didn't.

Getting to the delivery

Now for the delivery part of the Continuous Delivery cycle. After we have verified that a commit didn't break anything, the next logical step is to trigger a deployment of the new build to our staging (or production) environment.

While the practical side of this step depends on your deployment environment, it's actually very simple: after running your automated tests you call the script that handles your code deployment:

image: parsys/python2-python3-pil-supervisord-redis

pipelines:
  default:
    - step:
        script:
          - pip install -U tox
          - tox
          - python my_deployment.py

Bitbucket Pipelines will halt execution of your pipeline if one of the commands fails. If your automated tests using Tox fail, the deployment script won't be executed. If your tests are all in the green, the new build will be deployed by your script.
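What my_deployment.py actually does is entirely up to you. As a purely hypothetical sketch (assuming rsync and SSH access are available in the build container; the host, user and path are placeholders), it could look like this:

# my_deployment.py -- hypothetical sketch, not a prescription
import subprocess
import sys


def deploy():
    """Sync the working copy to the target server, excluding local clutter."""
    result = subprocess.call([
        'rsync', '-az',
        '--exclude', '.tox', '--exclude', '.git',
        '.', 'deploy@staging.example.com:/srv/pipetox/',
    ])
    # A non-zero exit code makes this pipeline step (and the build) fail.
    sys.exit(result)


if __name__ == '__main__':
    deploy()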

The Bitbucket Pipelines documentation is a bit sparse at the moment, but it will get you going. I suggest playing around with different build steps depending on what branch you push to. A scenario I already have in use with my team is that commits to our production branch get deployed to production (after testing, of course), commits to the staging branch go to staging, and so on; a sketch of such a configuration follows below.
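In bitbucket-pipelines.yml that looks roughly like this (the branch names and the argument passed to the deployment script are illustrative):

image: parsys/python2-python3-pil-supervisord-redis

pipelines:
  default:
    - step:
        script:
          - pip install -U tox
          - tox
  branches:
    staging:
      - step:
          script:
            - pip install -U tox
            - tox
            - python my_deployment.py staging
    production:
      - step:
          script:
            - pip install -U tox
            - tox
            - python my_deployment.py production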

Conclusion

Continuous Delivery may sound like a buzzword, but it is actually very important in any project. It helps a software team (or even an individual developer) produce stable and bug-free code and deploy new versions faster and more efficiently.

In this article we used Bitbucket Pipelines, but there are alternatives out there such as Jenkins. In my opinion they are a bit harder to integrate with your source control, but a good tool is an investment, so select the tools that make you the most productive!
