17 December 2009

Running parameterized JUnit tests in parallel

We use JUnit 4 for Anaconda, not just for unit tests, but also for integration or system tests. Typically, we iterate over all features of a given class in a database and validate each feature.

Note: I'm using the term feature in the sense of map feature, not in the sense of application feature or implemented requirement.

A simple pattern for such tests is:

public class FeatureTest
{ 
    @Test
    public void testAllFeatures()
    {
        for (Feature feature : findAllFeatures())
        {
            testOneFeature(feature);
        }
    }

    private void testOneFeature(Feature feature)
    {
        // some logic with one or more JUnit assertions
    }
}

Obviously, this naive approach has the following drawbacks:
  • The test fails and terminates on the first incorrect feature. The remaining features will not be tested.
  • All features get tested sequentially. This may take awfully long for a large database.
JUnit 4 has a specialized runner Parameterized for running all tests in a given class with different parameters from a given list of parameter sets. An instance of the test class is created for each parameter set, and the parameters are passed to the constructor via reflection:

@RunWith(Parameterized.class)
public class FeatureTest
{
    // This is the parameter for each instance of the test.
    private Feature feature;

    public FeatureTest(Feature feature)
    {
         this.feature = feature;
    }

    @Parameters
    public static Collection<Object[]> getParameters()     
    {         
        List<Feature> features = findAllFeatures();
        List<Object[]> parameters = new ArrayList<Object[]>(features.size());
        for (Feature feature : features)         
        {
            parameters.add(new Object[] { feature });
        }
        return parameters;
    }
 
    @Test
    public void testOneFeature()     
    {
        // assertions acting on the feature member
    }
} 

This solves the first problem: Each feature gets tested in its own test instance. Now if there is a large number of features or if each individual test is very expensive, we would like to run the test instances in parallel, using a thread pool, or maybe even a grid of multiple computers.

Browsing through the JUnit sources, I found a surprisingly easy way of parallelizing the tests with a thread pool, simply by using a custom runner:


@RunWith(Parallelized.class)
public class FeatureTest
{
   // same class body as above
}


All you need is a simple extension of the Parameterized runner:

public class Parallelized extends Parameterized
{
    
    private static class ThreadPoolScheduler implements RunnerScheduler
    {
        private ExecutorService executor; 
        
        public ThreadPoolScheduler()
        {
            String threads = System.getProperty("junit.parallel.threads", "16");
            int numThreads = Integer.parseInt(threads);
            executor = Executors.newFixedThreadPool(numThreads);
        }
        
        @Override
        public void finished()
        {
            executor.shutdown();
            try
            {
                executor.awaitTermination(10, TimeUnit.MINUTES);
            }
            catch (InterruptedException exc)
            {
                throw new RuntimeException(exc);
            }
        }

        @Override
        public void schedule(Runnable childStatement)
        {
            executor.submit(childStatement);
        }
    }

    public Parallelized(Class<?> klass) throws Throwable
    {
        super(klass);
        setScheduler(new ThreadPoolScheduler());
    }
}
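With this runner in place, the number of worker threads can be tuned per run via the system property read by the scheduler, for example on the command line (the test class name is just a placeholder):

```
java -Djunit.parallel.threads=32 org.junit.runner.JUnitCore com.acme.FeatureTest
```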


The RunnerScheduler interface is fairly new in JUnit and marked as experimental. I discovered it in the current version, JUnit 4.8.1, and found it missing in JUnit 4.4.0, which we have been using so far. RunnerScheduler is also available in JUnit 4.7.0, but I did not check whether that is the earliest version to include it.

04 October 2009

Eclipse Forms and Data Binding

Our map compiler Anaconda reads all parameters and settings from a configuration file, which over time has evolved from a simple Java properties file to a not-so-simple XML file which is validated by an XML schema.

To access the configuration at runtime, we use Java XML Bindings generated from our schema by xjc in a straightforward manner, without any fancy customizations.

Using the XML editing support in Eclipse, it is very easy to edit and validate a configuration file, at least from a developer perspective. However, our customers will not be too happy about editing large XML files by hand, so the idea is to develop a form-based configuration editor for our RCP application Anaconda Workbench, similar to the manifest editor of Eclipse PDE or the POM editor of m2eclipse.

These editors are based on Eclipse UI Forms, another layer on top of SWT and JFace, which is obviously powerful enough for complex tasks, as demonstrated by the above examples. Less obviously, it is rather poorly documented: the Eclipse online help has just 10 brief pages about UI Forms, plus the Javadocs, which, as usual, are not very useful for getting started.

The Eclipse Rich Client Platform book also has no more than one page on UI Forms and a link to an online article from 2004. On the Eclipse site, there are two more recent articles.
Looking for further tutorials, I came across Marco van Meegen's critical review Eclipse Forms im Härtetest ("Eclipse Forms put to the test"). I decided to make up my own mind, but after implementing a few examples with Eclipse Forms, I largely agree with his criticism: the API forces you to write lots of repetitive code, and it is not easy to figure out how to wire up the different classes to do your job.

To alleviate the shortcomings of Eclipse Forms, Marco created yet another layer called RCPForms. So I gave it a try, and I found it a lot easier to use than working with UI Forms directly. I had to use the sources from the Subversion trunk at Sourceforge, the older tagged or released versions do not seem to work with Eclipse 3.5.

Eclipse forms are usually wrapped by a ManagedForm which manages the state of the form parts and the underlying data models.

The form parts and the models can vary independently, and one model can be shared by multiple form parts.

A managed form is dirty when one of its parts holds changes more recent than the underlying model. Conversely, it is stale when a change in the underlying model is not yet reflected in the UI.

To handle this form lifecycle, RCPForms expects the model to support PropertyChangeListeners, which again requires you to add some boilerplate code to your model beans. For my example with a JAXB model, I managed to tweak xjc to generate the required listeners, which I will discuss in detail in a separate article.
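To give an idea of the boilerplate involved, here is a minimal hand-written sketch of such a bean, using a PropertyChangeSupport delegate; the class and property names are made up for this example:

```java
import java.beans.PropertyChangeListener;
import java.beans.PropertyChangeSupport;

// Hypothetical model bean with the listener support RCPForms expects.
public class FeatureModel
{
    private final PropertyChangeSupport changeSupport = new PropertyChangeSupport(this);

    private String name;

    public String getName()
    {
        return name;
    }

    public void setName(String name)
    {
        String oldValue = this.name;
        this.name = name;
        // Notify bound form parts, so the managed form turns stale and refreshes.
        changeSupport.firePropertyChange("name", oldValue, name);
    }

    public void addPropertyChangeListener(PropertyChangeListener listener)
    {
        changeSupport.addPropertyChangeListener(listener);
    }

    public void removePropertyChangeListener(PropertyChangeListener listener)
    {
        changeSupport.removePropertyChangeListener(listener);
    }
}
```

Since xjc generates plain getters and setters, the trick is to get the delegate and the firePropertyChange() call into each generated setter.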

Here is a screenshot of a simple example:


The warnings result from validators on missing mandatory fields.

13 September 2009

Eclipse RCP Application Branding

A while ago, I figured out how to use the branding features in Eclipse to make the About dialog in my Anaconda Workbench show all the included features with the appropriate icons and other information. This was based on Eclipse 3.4, and here is a screenshot of the results:



Clicking on Feature Details, you get a list of all branded features included in the product:


Not every feature is branded by default. You have to associate a branding plug-in with a given feature and include an about.ini file and an optional icon in the branding plug-in. I won't go into details for this Eclipse 3.4 example, seeing that things have changed in Eclipse 3.5; the rest of this post relates to Eclipse 3.5.

What's new in 3.5




The Feature, Plug-ins and Configuration Details buttons are gone; now there is just a single Installation Details button. To see the installed features, click on one of the feature icons to open a dialog listing all features branded with the same icon.



(Actually, I'm not quite sure whether or not the icon name is the grouping criterion, but I guess so from my experiments.)

Selecting a feature from the list and clicking on Plug-in Details, you can display all plug-ins contained in the feature.

Things to note


Eclipse Help has just two rather brief topics on Product Branding and Customizing a product. Branding a product is straightforward if you use the Product Configuration Editor as explained in the first topic. Things get more complicated when your product contains multiple features or even third-party products.

For each feature to be branded, you need a branding plug-in containing the following resources:

  • about.ini
  • about.properties (optional)
  • a 32×32 pixel icon
Make sure to include these resources in your build.properties so they will be exported to your plug-in JAR.

In about.ini, set the following two properties:

aboutText=Anaconda Workbench for Car Navigation Databases
featureImage=icons/anaconda32.png

You can use the optional about.properties to work with language-dependent variables in about.ini. Set featureImage to the relative path of your icon resource.
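As a sketch of the language-dependent variant, about.ini references a key with a % prefix, and about.properties supplies the translatable value (the key name is chosen for this example):

```ini
# about.ini
aboutText=%aboutText
featureImage=icons/anaconda32.png

# about.properties
aboutText=Anaconda Workbench for Car Navigation Databases
```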

By default, Eclipse assumes that the branding plug-in for a feature com.acme.foo is also named com.acme.foo. If you want to assign a different name, you have to declare the branding plug-in explicitly in the feature editor.

I ran into a conflict between feature branding and product branding: My top-level product Anaconda Workbench contains another product udigLite, which in turn contains Geotools, Eclipse EMF and Eclipse RCP.

udigLite is a modified subset of uDig. uDig has a top-level feature named net.refractions.udig which contains a branding plug-in of the same name with a product definition.

udigLite originally had a top-level feature net.refractions.udig.lite containing just a subset of the uDig plug-ins, including the net.refractions.udig plug-in, which cannot be left out, as it contains more than just branding information.

With this configuration, I ended up with two uDig features in the feature list: net.refractions.udig.lite as expected, and net.refractions.udig, although no such feature exists in my product.

It seems that a branding plug-in containing a product definition is automatically treated as a feature of the same name. I wonder if that's a bug or a feature (pun intended).

Anyway, to remove this duplicate, I dropped my net.refractions.udig.lite feature and changed the original net.refractions.udig feature to include just the required subset of plug-ins.

Another problem I ran into is bug 280186 or bug 289300, causing each feature to be listed twice in the exported product. To suppress that, you can drop the org.eclipse.equinox.ds plug-in from your product, provided your application does not depend on Declarative Services.

07 April 2009

Arrows with Styled Layer Descriptor

In our uDig plug-ins for the Anaconda Workbench, we use Styled Layer Descriptors (SLD) for the map styles.

One of the features we are currently working on is the direction attributes for one-way streets. Of course you want to visualize them on the map, but SLD does not directly support arrow decorations for lines.

I asked for advice on the uDig mailing list and got the answer in no time. You can either use an arrow character from a suitable font in a Text Symbolizer, or a clever combination of Line Symbolizers with different styles of dashed lines to emulate arrows.

Using an SLD snippet from the Geoserver blog, I got the following result:

[Sorry, screenshot deleted to avoid potential licence issues.]

02 April 2009

Getting Started with OSGi Declarative Services

OSGi Declarative Services have been on my mental agenda for a while, and now I've started using them. I'm not going to write yet another tutorial on the subject, but there are some bits and pieces I could not find anywhere else that may be worth mentioning.

To get an overview of Declarative Services, have a look at Neil Bartlett's tutorial on EclipseZone, or at his more recent EclipseCon presentation. The former is based on Eclipse 3.2.2, the latter on the latest Eclipse 3.5 milestones, so there are some differences to the current release Eclipse 3.4.2. In particular, Eclipse 3.5 supports a newer version of the Declarative Services Specification (1.1 vs. 1.0 in Eclipse 3.4).

All of the following refers to Eclipse 3.4.2. For using Declarative Services with Equinox, you need the following bundles:
  • org.eclipse.osgi
  • org.eclipse.osgi.services
  • org.eclipse.equinox.ds
  • org.eclipse.equinox.util
The last two are not included in standard Eclipse distributions. Download them from the Equinox download area, or use the Eclipse update manager to install Equinox Bundles.

Follow the tutorials to implement a simple example. If you cannot figure out why your component does not start, enable console logging for the Service Component Runtime in your OSGi launcher by setting the following system properties.
  • equinox.ds.debug=true
  • equinox.ds.print=true
Make sure to include the OSGI-INF folder with your component descriptors into the binary build on the Build tab of the Eclipse manifest editor. This amounts to adding the folder to the bin.includes property in the build.properties.
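For illustration, a minimal build.properties along those lines might look like this (source and output folder names assumed):

```ini
source.. = src/
output.. = bin/
bin.includes = META-INF/,\
               OSGI-INF/,\
               .
```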

Add a header Service-Component: OSGI-INF/components.xml to your manifest. (Choose any other filename if you like.)

Do not forget to specify the correct XML namespace http://www.osgi.org/xmlns/scr/v1.0.0 in your XML documents, or else your component definitions will not be recognized.

Seeing that Eclipse 3.4 has no graphical editors for the component descriptors, it is useful to install the Eclipse XML editor features and to work with XML validation on the component descriptors.

If your bundle contains more than one component, you can have one XML document per component and list them all in the manifest header, or you can have a single XML document containing all component descriptions, which is what I prefer.

To enable XML validation on a multi-component descriptor, write your own XML schema where the root element contains a sequence of 1..n component descriptors.

Here is an XML schema for this purpose:


<?xml version="1.0" encoding="UTF-8"?>
<schema xmlns="http://www.w3.org/2001/XMLSchema"
        xmlns:scr="http://www.osgi.org/xmlns/scr/v1.0.0"
        xmlns:tns="http://www.harmanbecker.com/anaconda/components"
        targetNamespace="http://www.harmanbecker.com/anaconda/components"
        elementFormDefault="qualified">

    <import namespace="http://www.osgi.org/xmlns/scr/v1.0.0"/>

    <element name="components">
        <complexType>
            <sequence>
                <element ref="scr:component" maxOccurs="unbounded"/>
            </sequence>
        </complexType>
    </element>
</schema>


Using this schema, a component descriptor for two components might look like this:

<?xml version="1.0" encoding="UTF-8"?>
<c:components xmlns:scr="http://www.osgi.org/xmlns/scr/v1.0.0"
              xmlns:c="http://www.harmanbecker.com/anaconda/components">

    <scr:component name="foo">
        <implementation class="com.acme.foo.FooImpl"/>
        <service>
            <provide interface="com.acme.foo.Foo"/>
        </service>
    </scr:component>

    <scr:component name="bar">
        <implementation class="com.acme.bar.BarImpl"/>
        <service>
            <provide interface="com.acme.bar.Bar"/>
        </service>
    </scr:component>

</c:components>

01 April 2009

JDBC Drivers in OSGi

How do you create a JDBC connection for a given JDBC URL in an OSGi application? In particular, how do you avoid an explicit dependency of your application code on a given driver?

In a plain old classpath context, you would invoke DriverManager.getConnection(), maybe after loading the driver class using Class.forName(), if your driver does not support the Service Provider mechanism, which is mandatory for JDBC 4.0.

The trouble is, even if your driver does provide the META-INF/services metadata, this does not work in OSGi, since DriverManager creates a number of class loader problems by using Class.forName() internally.

I'm working with a variant of the Zentus driver for SQLite, which I've modified slightly to build an OSGi bundle under Java 1.6. I added the service metadata, and indeed my application code now works without loading the driver via Class.forName() - as long as the code is running outside of OSGi.

There are the following issues:
  1. The context class loader of the current thread is used to scan for META-INF/services resources and to load the referenced classes. This happens during initialization of DriverManager, so it will not catch resources from bundles which are not visible to the current context.

  2. Additional drivers can register with DriverManager; they should do so in static initialization code, so that the driver gets registered when someone calls Class.forName("my.own.Driver").

  3. DriverManager.getConnection() iterates over the registered drivers until it finds one that can process the given URL. Unfortunately, it does a fatal double check to see if the class of that driver is the same as the one the caller would obtain. This amounts to calling Class.forName("some.matching.Driver", ccl) where ccl is the caller's class loader.
(To get some information on what is happening, it is useful to call DriverManager.setLogStream(System.out).)

The third point means that DriverManager.getConnection() will always fail if the driver class is not visible to your bundle, so there really is no way to avoid a dependency.
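The registration part of this machinery can be reproduced outside OSGi with a dummy driver. The class below is purely illustrative and produces no usable connections, but it shows the static-initializer registration pattern and lets you inspect what DriverManager has registered:

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Enumeration;
import java.util.Properties;
import java.util.logging.Logger;

// Purely illustrative driver: it demonstrates the registration pattern,
// but it cannot create real connections.
public class DummyDriver implements Driver
{
    static
    {
        // Register in static initialization, like a real driver,
        // so that Class.forName("DummyDriver") is all it takes.
        try
        {
            DriverManager.registerDriver(new DummyDriver());
        }
        catch (SQLException exc)
        {
            throw new ExceptionInInitializerError(exc);
        }
    }

    public boolean acceptsURL(String url)
    {
        return url != null && url.startsWith("jdbc:dummy:");
    }

    public Connection connect(String url, Properties info)
    {
        // Per the JDBC contract, return null for URLs we do not handle.
        // A real driver would create a Connection for matching URLs.
        return null;
    }

    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info)
    {
        return new DriverPropertyInfo[0];
    }

    public int getMajorVersion()
    {
        return 1;
    }

    public int getMinorVersion()
    {
        return 0;
    }

    public boolean jdbcCompliant()
    {
        return false;
    }

    // Required since Java 7.
    public Logger getParentLogger() throws SQLFeatureNotSupportedException
    {
        throw new SQLFeatureNotSupportedException();
    }

    // Scans the drivers visible to this class loader.
    public static boolean isRegistered()
    {
        Enumeration<Driver> drivers = DriverManager.getDrivers();
        while (drivers.hasMoreElements())
        {
            if (drivers.nextElement() instanceof DummyDriver)
            {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception
    {
        Class.forName("DummyDriver"); // triggers the static initializer
        System.out.println("registered: " + isRegistered());
    }
}
```

In OSGi, the fatal part is that this lookup happens against the caller's class loader, which is exactly what the driver bundle's classes are hidden from.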

Here is an outline of a partial solution:
  • Create an OSGi bundle for each JDBC driver you want to use. Add a bundle activator that calls Class.forName("some.jdbc.Driver"). This will register the driver with DriverManager when the bundle is started.
  • For each bundle that needs to create connections using DriverManager.getConnection(), add optional dependencies on the JDBC driver packages for all drivers that you are planning to use.
This is rather ugly, because you need to edit your client bundle manifest whenever you want to add support for an additional driver. But at least you can deploy your system with any subset of the defined drivers, maybe with just a single driver.

If you do not need to support legacy code, do not bother using DriverManager and the Service Provider mechanism. After all, this is just a poor man's service registry, and you are much better off using the OSGi Service Registry.

I'll give an example in one of my next posts.

29 March 2009

Maven Plugin for Eclipse Source Bundles

The Maven Source Plugin creates a JAR containing the sources of a project and attaches it to the main artifact of the project.

It would be nice if you could directly use this JAR as source attachment for a bundle in your Eclipse target platform, but Eclipse 3.4 expects a special Source Bundle format with specific headers in the manifest. There is a feature request to support this directly in the Maven Source Plugin.
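For reference, the source bundle format boils down to a manifest carrying an Eclipse-SourceBundle header pointing at the binary bundle; the bundle names and version here are made up:

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.acme.foo.source
Bundle-Version: 1.0.0
Eclipse-SourceBundle: com.acme.foo;version="1.0.0";roots:="."
```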

To fill this gap, I had a look at the sources of the Maven Source and Archiver plugins and wrote my own maven-sourcebundle-plugin. This plugin is available for public use as a subproject of the DataScript project hosted at BerliOS.

There is no public Maven repository for this project yet, but it is contained in the latest binary release rds-bin-0.30, or you can get the sources from Subversion.

3D Rendering in uDig

I've started experimenting with Java3D rendering in uDig. The goal is to use Java3D as a rendering engine first for plain old 2D data, and then for real 3D data in the next step.

More details can be found in the uDig wiki.

25 March 2009

Java3D and Eclipse

The world is not flat, and 3D maps are one of the latest hypes in navigation systems. (Is anybody using them for real, I wonder....)

So there is a need for me to deal with 3D city and terrain models in my applications, and of course the first thing you want is a 3D viewer. I started experimenting with Java3D a few months ago, and it felt such a relief compared to OpenGL programming in C. The only thing that gave me a hard time is the lack of documentation in some areas, but in the end I could answer most of my questions by studying the source code or by trial and error.

At first, I wrote a couple of stand-alone viewers with a Canvas3D embedded in an AWT Frame, but what I really need is Java3D embedded in an Eclipse view or editor.

I found a couple of postings on Java3D with SWT which sounded rather discouraging, but in the end it was rather straightforward. The java3d-eclipse project on Sourceforge gave me some useful hints, but I cannot recommend using it, since the way that it re-bundles the Java3D JARs is definitely not OSGi-compliant.

Now here is my own recipe:
  • Get the sources for Java3D 1.5.2 from https://java3d.dev.java.net. There is no source package, you have to use CVS. You need three subprojects: j3d-core, j3d-core-utils and vecmath.
  • Compile vecmath into a stand-alone OSGi bundle.
  • Follow the Java3D build instructions to build j3d-core and j3d-utils in one go, including some native library compilation and Java code generation.
  • j3d-core and j3d-core-utils mutually depend on each other and cannot be compiled separately, so I really don't understand why Sun created two libraries instead of one. Anyway, to be friendly to Maven and OSGi, I copied the original sources, the generated sources and the native library binaries to just one Maven project, created a POM and built my Java3D OSGi bundle using the maven-bundle-plugin.
  • I did this both for Windows and Linux and managed to launch the HelloUniverse demo from Equinox.
  • For cross-platform use, move the platform specific code (both the native libs and a handful of Java classes) to separate projects and build OSGi fragments for these, similar to SWT.
After these preparations, embedding a Canvas3D into an Eclipse view or editor is as easy as this:

public void createPartControl(Composite parent)
{
    awtContainer = new Composite(parent, SWT.EMBEDDED);

    Frame frame = SWT_AWT.new_Frame(awtContainer);
    frame.setLayout(new BorderLayout());

    GraphicsConfiguration config = SimpleUniverse.getPreferredConfiguration();

    Canvas3D canvas = new Canvas3D(config);

    createUniverse(canvas);
    frame.add(canvas, BorderLayout.CENTER);
}

19 March 2009

Anaconda: A New Architecture for Compiling Navigation Databases

In my previous post, I presented the Anaconda Workbench, without explaining about Anaconda. This is the code name for my current project, a Map Compiler for processing map data into a compact binary database for car navigation systems of Harman/Becker.

I gave a talk about Anaconda at the Eclipse Demo Camp in Hamburg in November last year. The slides of this talk explain the purpose of a Map Compiler and our usage of OSGi and Eclipse and lots of other fabulous Open Source components.

18 March 2009

Anaconda Workbench: A uDig Application

Now here is a glimpse at what we are using uDig for. I think all the digging in the internals of several Open Source projects is now beginning to pay off.

The Anaconda Workbench is an application for viewing and testing car navigation databases in the forthcoming Navigation Data Standard format (NDS). (This is a closed source project of Harman/Becker, so I cannot offer more than some general information here.)

We implemented a Geotools extension for the NDS format, and a corresponding uDig catalog plug-in. This was enough to build an NDS map viewer. The map has multiple levels of detail in separate layers. We use SLD for styling the layers and for activating the appropriate layer depending on the current map scale.

On top of the map viewer functionality, there is a name browser which allows you to select road names and display the road on the map or use it as a start or destination for a route.

There is also a route calculator plug-in which finds the shortest route between two points and highlights the route in a separate layer.

More functionality will be added step by step. In the end, the Anaconda Workbench shall become a front-end for our map compilation process and not just a viewer for the compiled map databases.

udigLite: An OSGi-friendly subset of uDig

At long last, my team and I have succeeded in creating a subset of uDig which suits our environment. The baby is called udigLite, and the first binary releases can be downloaded from BerliOS, the site also hosting our Mercurial clones of the original Subversion source repositories.

Read the whole story in the uDig wiki.