Documentation Driven Development

I’m by no means the first to propose this approach: I first heard the phrase “Readme Driven Development” from Tom Preston-Werner in 2010, and there he referenced Documentation Driven Development:

It’s important to distinguish Readme Driven Development from Documentation Driven Development. RDD could be considered a subset or limited version of DDD.

A quick google (v.) gives me various results for “documentation driven development”:

…and that’s just from the first page of results!

So let’s be clear: I’m not claiming to have invented anything here; I’m just distilling the various sources into my own thoughts.

I recently got around to reading The Year Without Pants by Scott Berkun, which details his actually-more-than-a-year stint working for Automattic. When he was describing a general workflow for updating WordPress.com, this caught my attention:

Write a launch announcement and a support page. Most features are announced to the world after they go live on WordPress.com . But long before launch, a draft launch announcement is written. This sounds strange. How can you write an announcement for something that doesn’t exist? The point is that if you can’t imagine a compellingly simple explanation for customers, then you don’t really understand why the feature is worth building. Writing the announcement first is a forcing function. You’re forced to question if your idea is more exciting for you as the maker than it will be for your customer. If it is, rethink the idea or pick a different one.

This reminded me of Tom Preston-Werner’s approach, and set me thinking about the problem again.

A README file, launch announcement, or release note is user facing, but users aren’t our only audience. If writing those things first helps us distil our thoughts about what we’re going to deliver to our users, then doing the same for our commit messages – or, taking it to the extreme, our comments – will help us stay focused too. This is particularly relevant when fixing bugs/issues/defects, as you generally go into them with a clear idea of what you are going to do to address them.

Of course, just as TDD isn’t always practical – e.g., for exploratory spikes – neither is DDD. Nor should your documentation be set in stone: good documentation lives and breathes alongside the code.

Using Travis CI for testing Django projects

A couple of months [1] ago, in my post about using tox with Django projects, I mentioned using Travis CI as well as tox for testing.

There’s plenty of documentation out there for Python modules and Django applications, but not so much guidance for testing a complete Django project. I can only speculate that this is because testing projects normally involves more moving parts (e.g., databases) as opposed to applications which are supposed to be self-contained.

A good example here is the pair of cookiecutter templates from Daniel Greenfeld – one of the authors of Two Scoops of Django. His djangopackage template contains a .travis.yml file, yet his django project template doesn’t. Since many consider these templates best practice, and Two Scoops a Django “bible”, perhaps I’m wrong to want to use CI on a Django project?

Well (naturally) I don’t think I am, so here is how to do it.

Travis CI has a whole bunch of services you can use in your tests. The obvious ones are there – e.g., PostgreSQL, MySQL, and Memcached. For a more complete system, there are also Redis, RabbitMQ, and ElasticSearch. Using these you can build a pretty complete set of integration tests using the same components you will use in a production environment.
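
As a rough sketch of what a fuller stack might look like – the non-PostgreSQL service names here are illustrative and may have changed since, so double-check the current Travis CI docs before copying – you would list them in your .travis.yml like so:

services:
  - postgresql
  - redis-server
  - elasticsearch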

For the purposes of testing capomastro [2], we only need a PostgreSQL database [3]. The first step is to say we want PostgreSQL available during tests:

services:
  - postgresql

Now we need to create the database, using the postgres user provided by Travis CI:

  psql -c 'create database capomastro;' -U postgres

The next part is to configure our Django project to use this database. Fortunately our project already provides a sample local_settings.py that is configured for connecting to PostgreSQL on localhost without a password, so all we need to do is modify this file to use the same postgres user:

  cp capomastro/local_settings.py.example capomastro/local_settings.py
  sed -i -e 's/getpass.getuser()/"postgres"/g' capomastro/local_settings.py
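
For context, the database configuration in such a local_settings.py might look something like the sketch below. This is purely illustrative – the real local_settings.py.example isn’t reproduced here – but it shows what the sed one-liner above is actually swapping out:

# Sketch of capomastro/local_settings.py (illustrative, not the real file)
import getpass

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'capomastro',
        # Locally this defaults to the developer's own username; on Travis CI
        # the sed command above replaces it with the literal "postgres" user.
        'USER': getpass.getuser(),
        'PASSWORD': '',
        'HOST': 'localhost',
        'PORT': '',
    }
}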

Finally, we can call Django’s syncdb command (with --migrate because, like everyone else, we use South) to set up our database:

  python manage.py syncdb --migrate --noinput

All of this is done in the before_script hook in Travis CI:

before_script:
  - cp capomastro/local_settings.py.example capomastro/local_settings.py
  - psql -c 'create database capomastro;' -U postgres
  - sed -i -e 's/getpass.getuser()/"postgres"/g' capomastro/local_settings.py
  - python manage.py syncdb --migrate --noinput

We can now execute the full test suite for the project, using the same database engine we use in development and production!

For reference, here is the complete .travis.yml file:

language: python
services:
  - postgresql
python:
  - 2.7
install:
  - pip install -r dev-requirements.txt
before_script:
  - cp capomastro/local_settings.py.example capomastro/local_settings.py
  - psql -c 'create database capomastro;' -U postgres
  - sed -i -e 's/getpass.getuser()/"postgres"/g' capomastro/local_settings.py
  - python manage.py syncdb --migrate --noinput
script:
  - python manage.py test


  1. Wow, this post has been sitting in drafts for quite a while!
  2. “master builder”
  3. A lot of people deploy against PostgreSQL, but develop and test against SQLite for speed and convenience. This will eventually bite them.

Using tox with Django projects

Today I was adding tox and Travis CI support to a Django project, and I ran into a problem: our project doesn’t have a setup.py. Of course I could have added one, but since by convention we don’t package our Django projects (Django applications are a different story) – instead we use virtualenv and pip requirements files – I wanted to see if I could make tox work without changing our project.

Turns out it is quite easy: just add the following three directives to your tox.ini.

In your [tox] section tell tox not to run setup.py:

skipsdist = True

In your [testenv] section, make tox install your requirements (see the tox documentation on deps for more details):

deps = -r{toxinidir}/dev-requirements.txt
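
For completeness, the dev-requirements.txt pulled in here might contain something along these lines – an assumption for illustration only, not the project’s actual file:

# Hypothetical dev-requirements.txt for a Django project using South and PostgreSQL
Django
South
psycopg2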

Finally, also in your [testenv] section, tell tox how to run your tests:

commands = python manage.py test

Now you can run tox, and your tests should run!
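
By default tox runs every environment in envlist; while iterating, you can also target a single environment with -e:

  tox            # run all environments listed in envlist
  tox -e py27    # run only the py27 environment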

For reference, here is the complete (albeit minimal) tox.ini file I used:

[tox]
envlist = py27
skipsdist = True

[testenv]
deps = -r{toxinidir}/dev-requirements.txt
setenv =
    PYTHONPATH = {toxinidir}:{toxinidir}
commands = python manage.py test