rapaul.com

A technical blog written by Richard Paul

Auckland Coderetreats 2012

Corey Haines was in town last month and delivered a great talk at Codemania. Fortunately for us he stuck around for the weekend and facilitated Auckland’s first (as far as I know) Coderetreat.

Global Day of Coderetreat, Saturday 8th December 2012

On the 8th of December developers from all over the globe will be taking part in the Global Day of Coderetreat. The last Auckland Coderetreat filled up quickly; this December we are fortunate enough to have two Coderetreats in Auckland.

Each of these Coderetreats is language agnostic, so take your pick. I’ll be facilitating the Movio Coderetreat, so that may sway your choice either way :P

Alex Henderson is facilitating the BizDojo Coderetreat and has written about it over on his blog.

Thanks to the great sponsors of these events there is no cost! To top that off there will be delicious food and drink provided throughout the day.

Wellington peeps, you’re in luck too.

What to bring

  • A laptop with a development environment setup (1 or more languages)
  • A way to run tests
  • A collaborative mindset, all work is done in pairs
  • An open mind

Coderewhat?

If you haven’t heard of a Coderetreat before here’s a quick summary pulled from coderetreat.org/about.

Coderetreat is a day-long, intensive practice event, focusing on the fundamentals of software development and design. By providing developers the opportunity to take part in focused practice, away from the pressures of ‘getting things done’, the coderetreat format has proven itself to be a highly effective means of skill improvement. Practicing the basic principles of modular and object-oriented design, developers can improve their ability to write code that minimizes the cost of change over time.

Sound exhausting? It is. It’s also a great deal of fun!

If this sounds like you, please sign up and we’ll see you on the 8th!

Constructor Injection, and How It Simplifies Unit Test Setup

I’ve recently been reading Growing Object-Oriented Software Guided by Tests (GOOS), and one (of the many) aha moments was a piece of test code that mocked the collaborators and instantiated the object under test - all in the declaration of the test’s private fields. I am particularly fond of this approach for two reasons:

  • The test code setup is minimal and easily scanned
  • This approach encourages all required collaborators to be passed in through the constructor (aka constructor injection)

I’ve included an illustrative example below using Mockito; the actual test isn’t important, but it shows that this setup style works.

import static org.mockito.BDDMockito.*;
import org.junit.Test;

public class ItemCheckerTest {
  
  // Test doubles for the collaborators, created as the fields are declared
  private final ItemFetcher itemFetcher = mock(ItemFetcher.class);
  private final Notifier notifier = mock(Notifier.class);
  // The object under test, wired up via constructor injection
  private final ItemChecker itemChecker = new ItemChecker(itemFetcher, notifier);
  
  @Test
  public void notifiesStoreManager() throws Exception {
      given(itemFetcher.fetch()).willReturn(new FetchedItem());
      
      itemChecker.check();
      
      verify(notifier).notifyStoreManager();
  }

  ...
}

For those unfamiliar with Mockito, the given call stubs a query, while the verify call uses a test spy to check that a command call was made.

The important lines are the 3 private member variables of the test class. The first 2 use Mockito’s mock method to instantiate test doubles for our collaborators. The 3rd member variable (itemChecker) is the object under test; you will notice that it is instantiated with both of its required collaborators in the constructor. These 3 lines perform all the wiring we require for our test, without having to resort to @Before methods to set properties.

The reason we can leverage the member variables for this setup is that JUnit creates a new instance of ItemCheckerTest for each of the test methods (@Test), providing each test with its own set of collaborators and ensuring each test runs in isolation.

The most important side effect of setting up the test code in this fashion is that it promotes the use of constructors for wiring up collaborators. Using the constructor for collaborators has a couple of very appealing aspects:

  • It becomes impossible to create circular dependencies between your objects
  • Your objects are less prone to wiring bugs as they are upfront about their required collaborators

Why would you want to be upfront about your collaborators? Steve Freeman & Nat Pryce (GOOS) have this to say:

Partially creating an object and then finishing it off by setting properties is brittle because the programmer has to remember to set all the dependencies. When the object changes to add new dependencies, the existing client code will still compile even though it no longer constructs a valid instance. At best this will cause a NullPointerException, at worst it will fail misleadingly.

Miško Hevery also has a great blog post on constructor vs setter injection.

From Zero to Headless Browser Tests in Jenkins

After spending a large portion of the day on it, I can proudly say I have a working set of browser based tests that run on a headless Jenkins install. By headless I mean a server without any physical display attached, as is typical for server machines. This facilitates the execution of high level acceptance tests in much the same fashion as lower level unit and integration tests, albeit at a slower rate.

The problem I am trying to solve here is a quick feedback loop on acceptance test level behaviour. This blog post covers getting Cucumber scenarios running for a single browser (Firefox). Cross browser testing is a different problem, for which I think Sauce Labs would be a better solution, as they take the hassle out of provisioning and maintaining a wide range of operating system and browser combinations.

Outlined below are the steps I followed to go from installing Ubuntu server edition through to running the browser based tests (with Cucumber, Capybara, Selenium-Webdriver). You may find some steps are not required on your operating system or for the project you wish to test. Admittedly I dove down a few rabbit holes, but thanks to VirtualBox’s snapshot feature I could safely revert if things turned sour.

  • Installing Ubuntu
  • Installing Jenkins
  • Going Headless
  • Installing Ruby with RVM
  • Installing Firefox
  • Creating a Job
  • Bonus Points, watching the browser in realtime

Installing Ubuntu

If you have an existing server you can skip this step. If not, grab yourself a copy of Ubuntu Server edition. As I wanted a simple way to play with Jenkins without provisioning hardware, I used VirtualBox for virtualisation. I followed the usual VirtualBox installation; however, once installed, the server showed a blank screen on boot. This was fixed by following the workaround on the Ubuntu forums.

When installing, it is handy to enable the OpenSSH server so you can SSH onto the box from your desktop terminal; this makes copying & pasting some of the later steps much easier. To make the VirtualBox server visible on the network, change the network mode from NAT to bridged.
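If you prefer to script that networking change, VBoxManage can do it from the host’s command line. A minimal sketch, assuming the VM is named "jenkins" and your host’s wired adapter is eth0 (both names are placeholders for your own setup); the VM must be powered off first:

$ VBoxManage modifyvm "jenkins" --nic1 bridged --bridgeadapter1 eth0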

Installing Jenkins

Installing Jenkins is a breeze, as Debian packages are provided; check the wiki page for details.

$ wget -q -O - http://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | sudo apt-key add -
$ sudo vi /etc/apt/sources.list.d/jenkins.list
Add "deb http://pkg.jenkins-ci.org/debian binary/"
$ sudo apt-get update
$ sudo apt-get install jenkins

This automatically creates an account called jenkins. We will need to log in as this user later, so set a password for jenkins with:

sudo passwd jenkins

You should now be able to view the Jenkins dashboard at http://your.server:8080/
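A quick way to confirm Jenkins is reachable from another machine is a HEAD request (this check is my addition, not part of the official install steps; expect a 200, or a 403 once you enable security):

$ curl -sI http://your.server:8080/ | head -n 1
HTTP/1.1 200 OK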

Going Headless

Now that Jenkins is installed, we want to get a headless display configured for our browser based tests. First up, hit Manage Jenkins > Manage Plugins > Available and install the Hudson Xvnc plugin (this works with Jenkins despite its name). Schedule Jenkins to restart to pick up the plugin. Once installed, this gives us the ability to start a headless display automatically when we configure our jobs; more on that later.

With Jenkins configured we need to ensure the required software is installed on the server:

$ sudo apt-get install vnc4server

vncserver requires a password to be set before it can be used, and this must be done before Jenkins can make use of the vncserver. For this we need to switch to the jenkins user and set a password.

$ sudo -Hiu jenkins
$ vncserver
Enter a password, and verify it
$ vncserver -kill :1 # or whichever display the vncserver output mentioned

When Jenkins runs it doesn’t need to know this password, but if you want to watch a running job you can connect to the running vnc session with that password and watch the tests in real time.

I initially headed down the Xvfb route but that seemed to require a lot of custom configuration in the job’s build script and isn’t related to the Xvnc plugin.

Installing Ruby with RVM

The job I want to run is a set of acceptance tests written in Cucumber, with automation done using Capybara (Selenium-Webdriver under the hood). So it’s a Ruby job, and all good Ruby jobs use RVM. Fortunately RVM has a page on integrating with Hudson/Jenkins. I followed the recommended steps and installed RVM for a single user (jenkins).

$ sudo apt-get install curl bison build-essential zlib1g-dev libssl-dev libreadline5-dev libxml2-dev git-core
$ sudo -Hiu jenkins
$ bash < <(curl -s https://rvm.beginrescueend.com/install/rvm)

Once RVM is configured, run rvm notes to find the full list of dependencies you need to install for your required version of Ruby, e.g.:

$ source ~/.rvm/scripts/rvm
$ rvm notes
$ sudo apt-get install build-essential bison openssl libreadline6 libreadline6-dev curl git-core zlib1g zlib1g-dev libssl-dev libyaml-dev libsqlite3-0 libsqlite3-dev sqlite3 libxml2-dev libxslt-dev autoconf libc6-dev ncurses-dev

Note that I didn’t give the jenkins user sudo rights, so I installed all packages through my usual admin account on the server.

RVM can be configured to allow the automatic installation of Ruby versions and gemsets by adding the following to ~/.rvmrc for the jenkins user:

rvm_install_on_use_flag=1
rvm_project_rvmrc=1
rvm_gemset_create_on_use_flag=1

Installing Firefox

Of course, a headless server isn’t any good without a browser to test with:

sudo apt-get install firefox

Firefox is the default browser that Selenium will select.

Creating a Job

At this point the Jenkins server should be fully configured to run headless jobs, so let’s dive in and create one. Create a new freestyle job. Notice there is a new option available under the ‘Build Environment’ section called ‘Run Xvnc during build’; check this to have the plugin automatically do its magic.

For my example, I didn’t bother checking the project out of source control; I simply created a project in the /tmp directory. You’ll want to enable the appropriate SCM plugin and configure a checkout. Under the build section add an Execute shell step with the following:

#!/bin/bash -e
cd /tmp/selenium-test
source "$HOME/.rvm/scripts/rvm"
[[ -s ".rvmrc" ]] && source .rvmrc
bundle install
cucumber

The -e flag in #!/bin/bash -e ensures the script stops after any error. You will notice that the script sources the project’s .rvmrc file directly; this ensures the correct version of Ruby is used, with a gemset appropriate for your project. My .rvmrc looked something like:

rvm --create use ruby-1.9.2@selenium-test

Calling bundle install reads the Gemfile.lock and installs all required gems. Finally cucumber kicks off the actual Cucumber scenarios and, fingers crossed, they should pass with flying colours.
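If your project doesn’t already have a Gemfile, a minimal sketch for this stack might look like the following, written here as a shell heredoc to stay in the terminal (the exact gem list is an assumption based on the tools mentioned above):

$ cat > Gemfile <<'EOF'
source 'http://rubygems.org'
gem 'cucumber'
gem 'capybara'
gem 'selenium-webdriver'
EOF
$ bundle install

With those gems in place, a passing build’s console output looks something like this: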

Feature: Headless
  In order to keep happy customers
  As a developer I want to ensure all my features continue to pass on CI

  Scenario: Headless browser
    When I check the nets without a head
    Then the nets should be readable

1 scenario (1 passed)
2 steps (2 passed)
0m7.975s
Terminating xvnc.
$ vncserver -kill :33
Killing Xvnc4 process ID 6873
Finished: SUCCESS

Bonus Points, watching the browser in realtime

As the headless display is running in vncserver, you can connect to the vnc session and watch the tests run in real time. Just use your regular VNC client and connect to your.server:59xx where xx is the display number output on the Jenkins console for the running job. You will need to enter the password you set the first time you ran vncserver.
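As an example, if the console output above reports display :33, the connection (using vncviewer here, though any VNC client will do) would look like:

$ vncviewer your.server:33   # display :33 maps to TCP port 5933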

[Note: most/all of these instructions should work with Hudson also]

Specification by Example, a Review in Snippets

I’ve just finished reading Specification by Example, a new book (currently in MEAP) by Gojko Adzic. Highly recommended to anyone involved in software creation (developers, testers, product owners, product managers, …). The book includes great insights into building the ‘right thing’ and the ‘thing right’, drawing on experiences from many successful software projects and teams. These projects each have dedicated chapters providing real insight into how the teams work and the steps they took to get there.

But you don’t have to trust my vague recommendations; I’ve included a few key excerpts from the book below:

Like cheap wine, long paper documentation ages rapidly and leaves you with a bad headache if you try to use it a year after it was created.

Working from the outputs ensures that there is always something that the business users can provide feedback on.

Understanding why something is needed, and who needs it, is crucial to evaluating a suggested solution.

Solve technical difficulties in the automation layer. Do not try to solve them in the test specifications.

We automate specifications to get fast feedback, but our primary goal should be to create executable specifications that are easily accessible and human-readable…

When each team worked to deliver a whole feature end to end, it was much easier for business users to collaborate with the team to specify the conditions of satisfaction and engage in illustrating them with examples.

The biggest benefit from this is getting us to talk together so that we have a mutual understanding of the requirements. That’s more important than test automation.

Long term value comes from living documentation

The first chapter is free to get you started.

Vertical Slicing

Vertical slicing is one of the key practices I see in being able to deliver features rapidly, with the code developed being clean and simple to extend to future requirements. The basic premise is that instead of dividing work into horizontal slices, where a task is something like ‘create a database schema’, you slice your work into small user stories where each story is something that can be demonstrated to a stakeholder on its own. These stories will often touch multiple ‘layers’ of the application such as the view layer, service layer and persistence layer.

The benefits of this approach are plentiful:

  • Each story delivers value, no matter how small.
  • Work in progress (WIP) is reduced as tasks such as ‘create a schema’ are not left floating around until the tasks that utilise them are completed.
  • You can focus your development effort on purely delivering just enough code to satisfy the story, without adding code you think you may need (YAGNI).
  • Tight feedback loop.
  • Reduced merge conflicts - your local code diverges from master for shorter spans.

Slicing stories vertically fits well with the Outside-In approach favoured by Behaviour Driven Development. Using an Outside-In approach you take a user story, build some acceptance criteria around it (potentially automated with a tool like Cucumber), then work from the outside in to complete the story. E.g. in a web application I like to start with the view layer, building in enough of the view to satisfy the acceptance criteria. The view layer now defines the acceptance criteria for the layer beneath it, in this case the controller. This process continues with each layer creating a pull signal to write more code at a lower level. Development continues in a Test Driven style until the acceptance criteria for the story are met.

An example case I like to give for vertical slicing is for a search feature we implemented at f1000.com. I talk briefly about it in my talk on Acceptance Testing with Geb (39 minutes in). The search feature contains a number of different stories - developing the entire feature behind closed doors without a tight feedback loop would invariably result in building the wrong thing, even if it was what was originally asked for. Instead we took an iterative approach with thin vertical slices for each story.

The first cut of the search feature was designed to offer the greatest value, allowing users to search by keyword. The user could not go to the second page of results, change the sort order, or use any of the other features we ended up with. By getting a first cut out to our stakeholders early on, they could immediately play with search and provide feedback on the direction. We continued in such a fashion, adding support for more stories based on their importance; for example, multi-domain search (evaluations, reports, faculty, blog) adds more value (and was done first) than being able to move to the next page of results, as the most relevant hits should appear on the first page.

Related reading: http://www.energizedwork.com/weblog/2005/05/slicing-cake.html

Acceptance Testing With Geb

Below are the slides and video of a presentation I gave at SkillsMatter on Acceptance Testing with Geb. Fast forward to 1:30 when I start the presentation.

This talk will cover the basics of using Geb to automate browser testing.

It will compare Geb with raw WebDriver/Selenium, showing Geb’s expressive Groovy API. It will also demonstrate how to integrate Geb with acceptance testing frameworks, namely Cucumber via Cuke4Duke.

It also covers an experience report on how and why we transitioned from raw WebDriver to Geb and how existing WebDriver projects can be ported across to Geb with minimal initial effort due to its underlying use of WebDriver.

Cucumber, Maven & TeamCity

Having a suite of acceptance tests is all well and good, but if they don’t run regularly they tend to rot. Much like unit tests, they should be part of your continuous integration configuration. We use TeamCity, so it was a logical choice for running our Cucumber scenarios.

Running Cucumber with Maven

As we predominantly use Java/Groovy, all our Cucumber tests are written in Groovy and run automatically through Maven as part of the integration test phase. Check out the Cuke4Duke project if you want to see more details about using Cucumber on the JVM. In particular there is a page dedicated to running Cuke4Duke with Maven. Once your pom has been configured, it’s simply a matter of calling the integration-test phase:

mvn integration-test                              # Run all features/scenarios
mvn integration-test -DcukeArgs="--tags @search"  # Run only scenarios tagged with @search

Locally I have a bash script that lets me simply type cuke @search to achieve the same result.

#!/bin/bash
# Usage: cuke @search
mvn integration-test -DcukeArgs="--tags $1"

Adding to TeamCity

TeamCity has built in support for running Maven goals, as such it’s relatively trivial to get the Cucumber scenarios running.

  1. Create a new build configuration, on the build step select Maven2 as the runner type.
  2. Set the goal to integration-test
  3. Skip most of the remaining fields.
  4. In the JVM command line parameters enter: -DcukeArgs="--strict" -DcukeMaxHeapSize="-Xmx4000m" -DcukeMaxPermSize="-XX:MaxPermSize=2000m"

This should be all that is required to get the build running. --strict tells Cucumber to fail the build when any steps are undefined or pending, rather than skipping them (optional but recommended). The max heap size and max perm size are properties configured in the pom to boost the memory allocations. These are required by our build, but I have a sneaking suspicion it’s to do with the parsing of the step definition closures (Groovy specific).

Currently we have a dedicated desktop box (Win7) registered as a TeamCity agent; this allows Cucumber to run the scenarios through the browser (via Selenium/WebDriver).

Adding Reporting to TeamCity

As the build configuration stands, Cucumber will output using its normal coloured terminal format. So while you will be getting feedback on whether the build was successful from TeamCity, drilling in to see any failures will involve reading the raw Cucumber output. Fortunately Cucumber allows us to specify the output format we require, including JUnit XML reports.

  1. Update the JVM command line parameters to include --format junit --out target/junit.xml
  2. Add an Ant JUnit report type with the reports directory set to %system.teamcity.build.checkoutDir%/target/**/*.xml

You now get individual scenarios reported as tests in TeamCity; you can then monitor for long-running tests, check stacktraces and get notified immediately via a TeamCity notifier as soon as a single scenario fails (no need to wait until the entire suite finishes).

Triggering Cucumber

Each time we deploy a new version of our application to the testing/dev box we want to automatically run the Cucumber tests. Deployment is handled via TeamCity, so we have a single click to push the latest code onto the server. Once the application has been deployed, the Cucumber build is automatically triggered; this can be set on the Build Triggering step of the configuration. Should the testing deployment not be manually triggered, the nightly build will push the latest code and run the Cucumber tests.

Key Features

The above screenshot shows the various builds we have related to our product. The Continuous Integration build runs our unit tests; these run quickly after every commit. Next we break the Cucumber build down into 3 discrete builds. Key Features is a tag (@keyfeature) we use against features and scenarios the business identifies as being the most important; the idea is that these are run first to give us quick feedback. In this case there are 75 key features which run in just under 10 minutes. The remaining scenarios are run in the non-key features build, which runs automatically immediately after the key features have completed. These currently take just under 50 minutes, so the total feedback loop is roughly 1 hour.

Work in Progress (WIP)

The last Cucumber build is the WIP (work in progress) build; these are features and stories that are currently being worked on. TeamCity is configured slightly differently in this case: any scenarios that pass will fail the build. The idea is that either the scenario was written incorrectly (it should fail first) or the scenario is now complete and the @wip tag should be removed. To make Cucumber fail when any scenarios pass we need to pass a special flag, --wip, in the JVM parameters.

-DcukeArgs="--format junit --out target/junit.xml --tags @wip --wip"

The current value of 17 work in progress scenarios seems a bit high; ideally we’d like to eliminate waste by keeping our WIP down. Cucumber supports such limits as a command line argument, as shown below.
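For example, appending a limit to the tag makes Cucumber fail the run when the tag occurs more often than allowed; the limit of 3 below is an arbitrary choice:

-DcukeArgs="--format junit --out target/junit.xml --tags @wip:3 --wip"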

Experience Report

If you’d like to hear more about Cucumber, Cuke4Duke, Groovy, Selenium & Geb I’m giving an experience report at SkillsMatter on the 26th of Jan.

Related reading

http://gojko.net/2010/01/01/bdd-in-net-with-cucumber-cuke4nuke-and-teamcity/

Failing Early in Bash

We use a bash script to automate deployment to our live servers; today it didn’t go so well. Drilling into the problem, it appeared one of the copy commands had failed due to permissions, but the deployment continued anyway and we didn’t notice.

Bash provides a handy feature which will stop the script if any of the commands fail (return a non-zero value).

set -o errexit

Here you can see a simple example:

set -o errexit  ## exit after any error (non-zero exit status)
echo before failing command
mkdir a/b
echo after failing command

Without this toggle enabled, the script would have quite happily failed to create directory a/b then continued on with the rest of the script, printing out after failing command. With the toggle enabled, the script stops after it fails to create the directory.

before failing command
mkdir: cannot create directory `a/b': No such file or directory

Obviously in this trivial example it doesn’t seem important, but when an essential part of your deployment fails you want to know about it immediately.
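It’s worth noting that errexit alone won’t catch a failure in the middle of a pipeline, as a pipeline’s exit status comes from its last command. For a stricter script, the following companion options (a suggested hardening, not part of our actual deployment script) are worth considering:

set -o errexit   # stop on any command returning a non-zero exit status
set -o nounset   # treat references to unset variables as errors
set -o pipefail  # a pipeline fails if any command within it fails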

Further reading available at: http://www.davidpashley.com/articles/writing-robust-shell-scripts.html

Checking Wordpress Images Are Present With Groovy

Below is a little script I knocked up today to ensure all uploaded images referenced in a Wordpress XML export are present on your server, a useful check when moving to a new Wordpress instance.

// Match uploaded image URLs in the export file
def imagePattern = ~'http://www.rapaul.com/wp-content/uploads/.+?/.+?/.+?(jpg|jpeg|png|gif)'
new File('./wordpress-export.xml').text.findAll(imagePattern).each { address ->
    try {
        // Opening the stream is enough; a missing image raises FileNotFoundException
        new URL(address).openStream().close()
    } catch (FileNotFoundException e) {
        println "Could not find $address"
    }
}

I’m sure a bash guru could hack together something similar, but Groovy really does make it simple.
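For the curious, a rough bash sketch of the same check might look like this (untested, and the regex is only an approximation of the Groovy pattern above):

grep -oE 'http://www\.rapaul\.com/wp-content/uploads/[^"<[:space:]]+\.(jpg|jpeg|png|gif)' wordpress-export.xml \
  | sort -u \
  | while read -r address; do
      curl -sfI "$address" > /dev/null || echo "Could not find $address"
    done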