Writing tests for Vala

There has been a long-outstanding issue to get a unit test environment up and running for Diodon, which is written in Vala. I have finally found a suitable solution. In this blog post I want to share my experience. I hope it helps you get your Vala code tested as well. Comments are welcome.

When I started my research on different Vala unit test frameworks, I found quite a few. Among them were vala-gtest-module and valadate.

These frameworks all looked quite promising. Unfortunately, none of them seem to be maintained anymore, and none are available as an official package in any distribution repository (note: it seems that Yorba has adopted valadate, so there is still hope… ;)).
This makes them difficult to use: to run your tests, the test framework library has to be built from source as well, and since such code bases are quite old, they might not even build with the current Vala version.

So I had to look for another solution. Doing so, I came across GTest, which is included in GLib. This has the great advantage that I do not need yet another dependency to get my Vala code tested. And luckily the developers behind libgee have created a very useful TestCase class on top of GTest, so using the GTest API is not as cumbersome.

Let’s take a look at this class and see how we can use it (note that the class is licensed under the LGPL and was written by Julien Peeters; see the original source code):

public abstract class Gee.TestCase : Object {

	private GLib.TestSuite suite;
	// keep references to the adaptors so they live as long as the suite
	private Adaptor[] adaptors = new Adaptor[0];

	public delegate void TestMethod ();

	public TestCase (string name) {
		this.suite = new GLib.TestSuite (name);
	}

	public void add_test (string name, owned TestMethod test) {
		var adaptor = new Adaptor (name, (owned)test, this);
		this.adaptors += adaptor;

		this.suite.add (new GLib.TestCase (adaptor.name,
		                                   adaptor.set_up,
		                                   adaptor.run,
		                                   adaptor.tear_down ));
	}

	public virtual void set_up () {
	}

	public virtual void tear_down () {
	}

	public GLib.TestSuite get_suite () {
		return this.suite;
	}

	// Adaptor bridges GTest's plain function callbacks to the virtual
	// set_up/run/tear_down methods of the enclosing TestCase
	private class Adaptor {

		public string name { get; private set; }
		private TestMethod test;
		private TestCase test_case;

		public Adaptor (string name,
		                owned TestMethod test,
		                TestCase test_case) {
			this.name = name;
			this.test = (owned)test;
			this.test_case = test_case;
		}

		public void set_up (void* fixture) {
			this.test_case.set_up ();
		}

		public void run (void* fixture) {
			this.test ();
		}

		public void tear_down (void* fixture) {
			this.test_case.tear_down ();
		}
	}
}

With this class in place, we can now implement a test case class like the following:

class TestExample : TestCase {

	public TestExample () {
		// assign a name for this class
		base ("TestExample");
		// add test methods
		add_test ("test_example", test_example);
	}

	public override void set_up () {
		// set up your test
	}

	public void test_example () {
		// add your assertions
		// assert (expression);
	}

	public override void tear_down () {
		// tear down your test
	}
}

There is no naming convention for how a test class and its test methods have to be called. However, I prefer to name a test class after the class it is testing, with Test as prefix. The class TestExample above, for example, tests the functionality of a class Example. I do the same for test methods: the method test_example tests the method example of the class Example, as sketched below.
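To make this concrete, here is a sketch of what such a pair could look like. The Example class, its example method and the doubling behaviour are all hypothetical, purely to illustrate the convention:

// a hypothetical class under test
class Example : Object {

	public int example (int value) {
		return value * 2;
	}
}

// the corresponding test case, fleshing out the skeleton above
class TestExample : TestCase {

	private Example? example;

	public TestExample () {
		base ("TestExample");
		add_test ("test_example", test_example);
	}

	public override void set_up () {
		this.example = new Example ();
	}

	public void test_example () {
		// example () is expected to double its input
		assert (this.example.example (21) == 42);
	}

	public override void tear_down () {
		this.example = null;
	}
}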

But that was just a side note. Let’s get back to setting up our testing environment. To be able to run the test we have just created, we need to compile all test classes into one executable with the following main method:

public static int main(string[] args) {
  Test.init (ref args);
  // add any of your test cases here
  TestSuite.get_root().add_suite(new TestExample().get_suite());
  return Test.run ();
}
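If you just want to try this out without a build system, such an executable can also be compiled directly with valac (the package glib-2.0, which GTest belongs to, is included by default). The file names below are just an example, and I assume you have either added a using Gee; directive or dropped the Gee. namespace prefix from the copied TestCase class:

valac testcase.vala test-example.vala main.vala -o example-test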

Of course, you can create this executable with your preferred build tool. In this post I just want to show how this can be done with waf, which I am currently using to build Diodon.

So a wscript_build file in a tests folder, where all test cases are located, could look like this:

prog = bld.new_task_gen (
  features = 'c cprogram test',
  target = 'example-test',
  source = bld.path.ant_glob (incl='**/*.vala'))
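As a side note: if your test code depends on further libraries, waf's Vala support lets you declare them on the same task generator. The packages and uselib values below are placeholders for whatever your project actually uses:

prog = bld.new_task_gen (
  features = 'c cprogram test',
  target = 'example-test',
  packages = 'gee-1.0',  # hypothetical: additional Vala packages
  uselib = 'GEE',        # hypothetical: C flags/libs found at configure time
  source = bld.path.ant_glob (incl='**/*.vala'))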

You probably noticed that a test feature is configured there. This feature runs the tests while building the executable with waf; I will come back to it later.

Let’s assume for the moment that this feature is not configured. We would then simply have an executable example-test which we can run in a terminal.
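For example (the exact path depends on where waf places the binary, typically somewhere below the build directory):

./example-test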

The output could look like this:

/TestExample/test_example: OK

Running the test executable like this always runs all test cases. This might take quite a while once you have written thousands of tests ;). For this reason there is a tool called gtester, also included in GLib, which has some nice features such as running only a specific test case.

So let’s say we only want to run the test case TestExample; we could use the following command:

gtester --verbose -p /TestExample example-test

gtester has many more options; best take a look at its documentation.
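For example, to keep going after a failing test (-k) and write the results to an XML log file (-o), something along these lines should work; both options are described in the gtester man page:

gtester -k -o test-results.xml example-test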

It is great that we can now run our tests manually to see whether our changes broke anything. However, it would be even better if they were run automatically every time we build our source code, and if the build failed whenever a test fails (of course there should always be an option to disable running tests). The advantage: when someone builds or packages your Vala code on a different system with different dependencies, errors a user might otherwise hit at runtime already show up while building, respectively while running the tests.

To do this with waf I want to come back to the test feature mentioned above. Besides configuring this feature, we have to adjust the main wscript, which could look like the following:

# imports needed for the snippets below (waf 1.6-style module paths)
from waflib import Options
from waflib.Tools import waf_unit_test
...
def options(opt):
  ...
  opt.add_option('--skiptests', action='store_true', default=False,
    dest='skiptests', help='Skip unit tests')
  ...

def build(ctx):
  ...
  if not Options.options.skiptests:
    ctx.add_subdirs('tests')
  ...
  ctx.add_post_fun(post)

  # to execute all tests:
  # $ waf --alltests
  # to set this behaviour permanently:
  ctx.options.all_tests = True

def post(ctx):
  waf_unit_test.summary(ctx)

  # tests have to pass, otherwise the build fails
  lst = getattr(ctx, 'utest_results', [])
  if lst:
    tfail = len([x for x in lst if x[1]])
    if tfail:
      ctx.fatal("Some tests failed.")

When ./waf build is now called, the tests are run as well. They can be skipped by simply using ./waf build --skiptests.

You can find a full-fledged example in the Diodon source code.

If you use automake, I recommend taking a look at the libgee source code, which implements a similar approach with automake.

This is it. Hope this was helpful. And do not forget: Test more!!! ;).
