Testing a legacy Java application with Groovy, Spock, Spring Test and Unitils

Why Groovy and Spock?

I was thinking about how to retrofit a legacy Java application with automated tests.

Why not use Groovy? It has a powerful syntax that lets you do more with less code. Fixture generation becomes easier – Groovy has language-level support for lists and maps. Objects with many attributes can be created using constructors with named parameters. Complex object hierarchies? No problem – write your own builder or use one of the implementations that Groovy ships with (e.g. for handling XML). Closures simplify the creation of test helper methods (implementing the template method pattern is trivial with a closure). And most importantly – all these features can be used on the existing Java code.
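As a taste of what this looks like, here is a minimal sketch – the Customer class is made up for illustration, not taken from the sample project:

```groovy
// Hypothetical domain class standing in for the legacy model
class Customer {
    String name
    int age
}

// Named-parameter constructor and list literals make fixtures concise
def john = new Customer(name: 'John', age: 42)
def customers = [john, new Customer(name: 'Anna', age: 30)]

// Map literal support: index the fixtures by name
def byName = [:]
customers.each { byName[it.name] = it }

// A closure as a template method for test helpers
def withFixture = { Closure test ->
    def fixture = new Customer(name: 'Fixture', age: 0)  // setup
    test(fixture)                                        // exercise
    // teardown would go here
}
withFixture { c -> assert c.age == 0 }
```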

Next step was to choose a testing library.

I was planning to write integration tests interacting with the database. The application made extensive use of the Spring IoC container and declarative transactions. Persistence was implemented with Hibernate using the Spring ORM module. So the natural choice was JUnit 4 with the Spring Test framework (for application context injection and transaction management in test methods).

I wanted to write BDD-style tests. My goal was to describe system features through specifications. This would give me a clear benefit – documentation and examples of the behaviour of the legacy code. I also wanted to have the tests integrated with Spring. So my choice fell on Spock (here you can find a short and concise introduction to Spock).

Why Spock? You can find strong arguments in the Spock wiki. Additionally, it integrates seamlessly with Spring and Spring Test. I also liked the separation of the four test phases through blocks (given, when, then and cleanup).

Spock and Spring Test test drive

To test Spock capabilities I created a very simple project based on the legacy system code. It uses Hibernate for persistence, Spring IoC and ORM. You can find the complete source code on GitHub:

https://github.com/mgryszko/blog-spock-spring-unitils

Spock integrates with Spring through an extension. The documentation is short but to the point. Don’t forget to check the examples on GitHub.

To use Spring Test annotations in your specification, first include spock-spring.jar on your classpath (e.g. as a Maven dependency):

<dependency>
    <groupId>org.spockframework</groupId>
    <artifactId>spock-spring</artifactId>
    <version>0.5-groovy-1.7</version>
    <scope>test</scope>
</dependency>

Then tell Spring which configuration files should be used to build the application context. This is done just like in a normal JUnit 4 Spring test – place @ContextConfiguration on the specification. Spock will intercept the specification lifecycle method calls (setup, cleanup, setupSpec, cleanupSpec) and delegate them to Spring’s TestContextManager. From then on you can use the @Autowired and @Resource annotations to inject dependencies. To execute a feature method in a transaction, mark it with @Transactional, as if it were a JUnit test method.

But wait, this is an integration test and I want to populate the database with test data! Clearly DbUnit comes to mind. Let’s use it.

The easiest way to set up a persistent fixture is to use the Spock Unitils extension. The Unitils library provides a simple way to load DbUnit fixtures (it can also create a Spring application context and inject Spring dependencies, which overlaps with the functionality offered by Spring Test).

First include spock-unitils.jar on your classpath (in my sample as a Maven dependency; I explicitly included unitils-dbunit in order to exclude some dependencies already provided by other artifacts used in the project):

<dependency>
	<groupId>org.spockframework</groupId>
	<artifactId>spock-unitils</artifactId>
	<version>${spock.version}</version>
	<scope>test</scope>
</dependency>
<dependency>
	<groupId>org.unitils</groupId>
	<artifactId>unitils-dbunit</artifactId>
	<version>${unitils.version}</version>
	<scope>test</scope>
	<exclusions>
		<exclusion>
			<groupId>commons-logging</groupId>
			<artifactId>commons-logging</artifactId>
		</exclusion>
		<!-- Spock already includes JUnit -->
		<exclusion>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
		</exclusion>
		<!-- due to Hibernate 3.2 dependency -->
		<exclusion>
			<groupId>org.unitils</groupId>
			<artifactId>unitils-dbmaintainer</artifactId>
		</exclusion>
		<!-- caused by DBUnit dependency -->
		<exclusion>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-nop</artifactId>
		</exclusion>
	</exclusions>
</dependency>

Then, put the @UnitilsSupport annotation on the specification to enable Unitils support. To populate the database with test data before every feature method, annotate your specification with @DataSet. By default, Unitils will look for an .xml data file in the same directory (package) as the specification (net/gryszko/spock/dao/BankDaoSpec.xml).
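Such a data set file could look roughly like this – a flat XML data set sketch where the table and column names are assumptions; check the actual schema in the GitHub project:

```xml
<!-- net/gryszko/spock/dao/BankDaoSpec.xml
     Flat XML data set: each element is a table row,
     attributes are column values (names are illustrative) -->
<dataset>
    <bank id="1" name="MBank"/>
</dataset>
```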

The complete specification with Spring and Unitils features is listed below:

@ContextConfiguration(locations = ["classpath:/resources.xml"])
@UnitilsSupport
@DataSet
class BankDaoSpec extends Specification {

  @Autowired
  private BankDao dao

  @Transactional
  def "finds a bank by name"() {
    setup:
    def bankName = 'MBank'

    when:
    Bank bank = dao.findByName(bankName)

    then:
    bank.name == bankName
  }
}

Note on Spring and Unitils transaction synchronization

When Unitils (with DbUnit) and Spring are used together, there are two options for transaction handling:

  1. transaction handling is disabled for Unitils in unitils.properties (DatabaseModule.Transactional.value.default=disabled)
  2. Unitils handles transactions using the Spring PlatformTransactionManager
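For the first option, unitils.properties could contain something like the following – the datasource settings are illustrative HSQLDB values, not the project's actual configuration:

```properties
# Option 1: let the application (Spring) manage transactions
DatabaseModule.Transactional.value.default=disabled

# Unitils datasource configuration (illustrative HSQLDB settings)
database.driverClassName=org.hsqldb.jdbcDriver
database.url=jdbc:hsqldb:mem:test
database.userName=sa
database.password=
database.dialect=hsqldb
```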

In the second case, there are two ways of configuring the datasource for Spring and Unitils:

  1. Spring and Unitils use separate datasources. Unitils creates its datasource based on the configuration in unitils.properties. The Spring datasource depends on the implementation used (in the simplest case it could be org.springframework.jdbc.datasource.SimpleDriverDataSource).
  2. Spring and Unitils share a single datasource. Unitils creates its datasource based on the configuration in unitils.properties. This datasource is then exposed to the application context through UnitilsDataSourceFactoryBean.
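A minimal sketch of the shared-datasource variant (option 2) in the Spring configuration – the bean names and the choice of transaction manager are assumptions for illustration:

```xml
<!-- The Unitils-managed datasource, exposed to the Spring context -->
<bean id="dataSource"
      class="org.unitils.database.UnitilsDataSourceFactoryBean"/>

<!-- The application's transaction manager uses the same datasource,
     so Spring and Unitils synchronize on a single resource -->
<bean id="transactionManager"
      class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource"/>
</bean>
```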

What implications does the choice of datasource configuration have?

If Unitils transaction handling is disabled, DbUnit operations run with the default database and connection transaction settings. In my example, this worked well with both HSQLDB and MySQL.

If Unitils transaction handling is enabled and two separate datasources are configured for Unitils and Spring, then the PlatformTransactionManager synchronizes on two datasources (the first is used for DbUnit operations, the second for the application persistence). This results in two parallel transactions, which works well for MySQL, but not for HSQLDB (deadlock).

If Unitils and Spring share the same datasource, then the PlatformTransactionManager synchronizes on it. A transaction is started for the test method. When a DbUnit operation is about to be performed, that transaction is suspended and a new one is started. After the DbUnit operation finishes, the original transaction (for the test method) is resumed. This works well for both MySQL and HSQLDB.

Notice that even if Spring uses HibernateTransactionManager and Unitils uses DataSourceTransactionManager, they synchronize on the same resource (the datasource) via TransactionSynchronizationManager.

Wrap up

Writing tests in Groovy is a lot of fun. With Spock they are well structured and more human-readable than pure JUnit tests. There are already a lot of extensions. As of version 0.5, it integrates with the Maven and Gradle build tools and the Grails, Guice, Spring, Tapestry and Unitils frameworks.

For a quick DbUnit integration, Unitils lives up to expectations. For more sophisticated scenarios it has some drawbacks: you cannot define shared database fixtures (the dataset is loaded before every test method) and you have to be careful with the transaction configuration (both in your application and for Unitils). Depending on the application and database configuration, your tests can deadlock.

A solution to these shortcomings would be a custom @DataSet annotation working with a DbUnit test execution listener and participating in application transactions. But this is a subject for another blog post…

2 thoughts on “Testing a legacy Java application with Groovy, Spock, Spring Test and Unitils”

  1. Hi Marcin

    Thanks for the post. I’m interested in your last couple of sentences. Firstly you say that the tests can become deadlocked – do you mean if you are running multiple tests concurrently?

    You mention that you can’t define shared database fixtures – and I’ve found this problem – either the data is loaded for every method, or only some methods, in a class.

    Finally you hint that you might show a custom @DataSet annotation – any progress on this? 🙂

    I think it would be nice to have DBUnit configured with some Groovy DSL rather than XML, but I don’t think anybody’s done this yet. I may look into this…

    1. Hi Jonny,

      I meant that transactions started from a single test method can become deadlocked (there are 2 of them).

      Custom @DataSet annotation for Spring Test – no progress yet 😦

      Groovy DSL for DBUnit – I had the same idea but never tried to implement it. Write me an email so we can talk about it.
