About Drools and Infinite Execution Loops

Esteban's Blog

Infinite recursion is a typical problem when we are working with Drools (and possibly any other rule engine). The inference mechanism going on inside Drools requires rules to be re-evaluated and executed as needed. This powerful feature of rule engines can be really beneficial sometimes, but at other times it can be a real pain. I'll try to analyse in this post the origins of this recursion and the most common ways we have to deal with it in Drools: control fields, control facts, rule attributes and fine-grained property change listeners.



MEAN installing prerequisites

Looking to learn something new, I thought I would install the MEAN stack and try to write some code. I use both a Windows laptop and a MacBook, so I thought I would install on both and see how it goes.

What is MEAN?

MEAN stands for MongoDB, Express, AngularJS and Node.js. Using these four technologies you can write a web application using JavaScript for both client and server. Plus, with Mongo you can store JSON straight in the database.

What are the prerequisites?

To get started with MEAN development you need to make sure the following are installed:

  • Git
  • Node.js
  • MongoDB

Everything else can be installed using the npm package manager that comes with Node.js.

Installing on a Mac

To help with installing I would recommend first installing a package manager called homebrew. It makes installing, updating and uninstalling packages a breeze. For up to date instructions visit the homebrew site, but to give you an idea of how easy it is, these are the current instructions.

Simply paste the following into a terminal prompt and press enter.

/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

Next install Git

brew install git

And install mongo

brew install mongodb

Mongo will by default use a data directory of /data/db so you either need to create this directory or override the default to a different one. Let’s just go ahead and create the default.

sudo mkdir -p /data/db

Next we need to change ownership of this directory so that mongo can use it.

sudo chown -R $USER /data/db

Details can be found on the mongo site.

Finally install node

This was the only step where I had some difficulty. It is recommended to use a version manager that allows multiple versions of node to be installed so you can switch between them in the terminal. The version manager I am using is NVM, and the instructions I was following said to install NVM using ‘brew install nvm’. I found I couldn’t get this to work properly, and when I checked the actual NVM github page it specifically says not to use homebrew. Fortunately it’s pretty easy to install anyway (for up to date instructions visit the NVM site).

Simply paste the following in a terminal and press enter:

curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.32.0/install.sh | bash

Restart the terminal to make sure NVM is started and install a version of node.

nvm install node

That’s all you need to begin with but you can see on the NVM site how to install additional versions, switch between them, set a default etc.

Installing on Windows

First up, all three of these tools are designed to run on *nix. You can run them on Windows but it’s fair to say it’s not their natural home, so you have to do a bit of extra work.

Install git

There are different ways to work with git on Windows, but if you want access to the full range of commands then it pays to have a bash shell for Windows where you can run git on the command line. To do this I use Git for Windows. Download it from the following website and run the installer.


I’ve used this tool for a few years now and it is my go-to shell for development if I want to run commands such as git or maven, since I prefer bash to the Windows shell. However, recently with Windows 10 and the latest git bash shell I’ve had problems where it doesn’t pick up certain keystrokes. In particular the arrow keys don’t work so you can’t scroll through options. I managed to find a workaround on stackoverflow where you launch git bash from within the Windows shell. E.g.

cmd.exe /C "C:\Program Files\Git\bin\bash.exe" --login -i

Install mongo

If you’re used to working with a database like MySQL then you have probably used the graphical installer which makes installing a breeze. Mongo lags behind a bit in this regard and getting setup is a much more manual process. However, there is a nice tutorial on the Mongo site which explains what to do.


The tutorial runs through how to install Mongo, set the directory where the data lives and create a Windows service to be able to Start/Stop MongoDB.

Install node

The Windows installer for node.js can be downloaded from the node site.


I have found that some node modules also need python to install properly. On a mac this isn’t a problem as python 2.7 comes pre-installed but on Windows you’ll need to install python separately.

The .msi installer for python can be downloaded from their website; I would install 2.7. If you’re not familiar with python, there is a newer 3.5 version but it is not backwards compatible and most people still use 2.7!



I’m usually happy developing on either my Windows laptop or MacBook, but I have to say that node development for me is much better suited to the Mac. Everything installed easily and was a joy to use, whereas on Windows I have wasted hours frustratingly searching for workarounds to problems.

If you’re stuck on Windows then I’m not saying you have to go out and buy a Mac, but if you have the choice then I would thoroughly recommend using one.

Setting the JVM character encoding on the AS400

When running a Java application (on any system) the JVM can be started with different character encodings. Typically it will pickup the system default based on the OS and locale system settings. For an excellent introduction to character encodings I can highly recommend Joel Spolsky’s article.

I recently encountered a problem with this on an AS400 where the Java application was trying to write a file out to the IFS file system. The filename in question contained Polish characters, or at least it was supposed to.

You can see the encoding in use by looking at the JVM properties against the job. Use WRKJOB to work with the job and take option 45 to Work with Java virtual machine. There are a couple of places you can then check.

  • Option 2 will show the system environment variables that were used to initialize the JVM
  • Option 7 will show the current Java system properties that are in use by the JVM
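
You can also double check from inside Java what the JVM actually resolved. This little diagnostic class (the class name is mine, it isn't part of any project) prints the two values worth comparing:

```java
import java.nio.charset.Charset;

public class EncodingCheck {

	// The charset the JVM resolved at startup from the environment
	public static String defaultEncoding() {
		return Charset.defaultCharset().name();
	}

	public static void main(String[] args) {
		// file.encoding is the system property the online sources talk about
		System.out.println("file.encoding   : " + System.getProperty("file.encoding"));
		System.out.println("default charset : " + defaultEncoding());
	}
}
```

Run it before and after changing the environment variables to confirm the JVM picked up what you expect.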

Searching online I saw many sources claiming that setting the JVM argument file.encoding will set the default character encoding. On the AS400 this can be achieved by navigating to the SystemDefault.properties file and adding the line:

file.encoding=UTF-8
Note: SystemDefault.properties can either reside in /QIBM/UserData/Java400 or in the user home directory for the user starting the JVM.

This may work when it comes to writing out the contents of a file but it has no effect on the encoding used to read or write filenames.

To properly influence the encoding used to initialize the JVM you have to set environment variables for the locale. In the Linux/Unix world this can be done with the LC_ALL environment variable, which can be set to something like en_US.UTF-8.

The AS400 isn’t a *nix platform so LC_ALL does not apply, and the JVM is an IBM platform specific implementation. By looking at the environment variables against option 45 in WRKJOB, and with some trial and error, I managed to find that setting the following two environment variables did the trick.

Using ADDENVVAR add the following two environment variables:


You can try different combinations of CCSID and locale depending on your desired character encoding. I set this up so we would be using UTF-8.

A list of locales can be found here. Note that en_US is either ISO8859-1 or 8859-15 whereas EN_US is UTF-8.


Web services with JAX-WS, JAXB and Spring

I’ve recently been working a lot with Java web services, most of these were greenfield projects where we were able to choose the architecture. I decided to use JAX-WS to create the web services but was unsure initially of the best way to go about this. In general there are two approaches to writing web services (contract first or code first).

Contract first
Contract first requires a wsdl to be written first and then JAX-WS can be used to generate matching code. This approach has its place if you already have a wsdl, but wsdls are not the easiest of things to work with and maintenance quickly becomes messy.

Code first
Code first is much easier as you simply write the code using a handful of basic annotations and let JAX-WS generate the wsdl for you at run time. The downside to this approach is that you have to write the code before you have a wsdl or schema available. This may not be an issue but if you need to write a specification first then it’s handy to have some kind of schema to define how the inputs and outputs are going to look.

Also, in some cases you may be able to express more in a schema than you can with just Java code. For instance you might want to set a restriction in the schema (like a string max length), or perhaps you want a particular nested structure of elements. You could do this yourself with JAXB annotations but it’s easier to write a schema and generate the required classes.

Combined approach using JAXB
I asked this as a question on stackoverflow and one of the answers provided inspiration for a kind of best of both worlds approach. This has now been implemented for several different projects and overall it’s been a pleasure to work with. The basic idea is that development follows something like the following process.

  1. Write a basic schema that defines the request and response types (this can be included in specifications and is easier to maintain than a full wsdl)
  2. Use JAXB/XJC to generate the request and response types
  3. Write a JAX-WS endpoint using the generated types as inputs and outputs
  4. Let JAX-WS generate the full wsdl at runtime

Spring integration
In addition to this I wanted to use spring for managing services, dependency injection and loading properties files. This requires setting up JAX-WS slightly differently so that spring can load the endpoints and inject dependencies. If you don’t do this then you’ll end up with a JAX-WS version of the endpoint with none of its dependencies injected while spring will have its own instance complete with injected dependencies but not handling any requests.

Maven build and dependencies
I’m using maven to manage the build and dependencies. You don’t have to use maven but it really does make life much easier. These are the required dependencies you’ll need in your pom.xml file.

		<!-- JAXWS web services -->

      	<!-- Spring DI -->

        <!-- JAX-WS/Spring integration -->
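
Filled in, those three dependencies look something like this, in the same order as the comments above. The versions shown are just examples from around the time of writing; check Maven Central for current ones.

```xml
<dependency>
    <groupId>com.sun.xml.ws</groupId>
    <artifactId>jaxws-rt</artifactId>
    <version>2.2.8</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
    <version>3.2.4.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.jvnet.jax-ws-commons.spring</groupId>
    <artifactId>jaxws-spring</artifactId>
    <version>1.9</version>
</dependency>
```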

In the build section of the pom you’ll also need to configure the JAXB plugin to auto-generate code from your schemas. The way it’s set up here, it will look for any schema in the directory /src/main/resources/xsd and then generate code into a source folder called /target/generated-sources/src/main/java. If you’re using eclipse you’ll want to right click this folder and take the option Build path > Use as source folder. If you don’t do this you’ll still be able to run a build using maven but you’ll probably see compile errors in eclipse.

            <!-- Generate JAXB Java source files from an XSD file -->
                    <!-- Don't set package name here as we want different packages for each schema.
                         Instead we set in each schema separately. -->
                    <!-- <packageName></packageName>  -->
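
As a sketch of that plugin configuration, assuming the codehaus jaxb2-maven-plugin (the plugin version and parameter names are from memory, so double check them against the plugin docs):

```xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>jaxb2-maven-plugin</artifactId>
    <version>1.5</version>
    <executions>
        <execution>
            <goals>
                <goal>xjc</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <schemaDirectory>src/main/resources/xsd</schemaDirectory>
        <outputDirectory>target/generated-sources/src/main/java</outputDirectory>
    </configuration>
</plugin>
```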

Web app config files
Next we need to add some configuration files to setup our web application. First is the web.xml deployment descriptor. Here we define the standard spring listener to load our spring configuration and we also setup a servlet to listen for our web service requests. Rather than use the JAX-WS servlet we’ve used a spring wrapper which will later allow us to use dependency injection in our endpoint classes.

<?xml version="1.0" encoding="UTF-8"?>
<web-app version="2.5" xmlns="http://java.sun.com/xml/ns/javaee"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">

  <!-- Load spring configuration -->
  <listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
  </listener>

  <!-- Servlet to handle all jax-ws requests -->
  <servlet>
    <servlet-name>jaxws-servlet</servlet-name>
    <servlet-class>com.sun.xml.ws.transport.http.servlet.WSSpringServlet</servlet-class>
  </servlet>
  <servlet-mapping>
    <servlet-name>jaxws-servlet</servlet-name>
    <url-pattern>/service/*</url-pattern>
  </servlet-mapping>

  <!-- There shouldn't be any sessions created but it's good practice to define a
  timeout as it varies between containers. -->
  <session-config>
    <session-timeout>30</session-timeout>
  </session-config>

</web-app>
The spring config is minimal in this basic example. It’s just turning on classpath scanning for the package in our test project and enabling auto-wiring of scanned dependencies. We also need to setup which urls map to which endpoint classes but this has been moved to a separate file which we import at the bottom.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">

    <context:component-scan base-package="com.testservices"/>

    <!-- Define the jaxws endpoint -->
    <import resource="jaxws-context.xml"/>

</beans>

Usually with JAX-WS you need a config file called sun-jaxws.xml where you define which urls map to which endpoints. In this case we’re using a spring JAX-WS servlet so instead we add our mappings here. This simple mapping is saying all web service requests to /service/test1 will go to the spring bean with an ID of test1Services. We shall see this bean shortly.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:ws="http://jax-ws.dev.java.net/spring/core"
       xmlns:wss="http://jax-ws.dev.java.net/spring/servlet">

    <!-- Define our jaxws endpoint (replaces sun-jaxws.xml) -->
    <wss:binding url="/service/test1">
        <wss:service>
            <ws:service bean="#test1Services" />
        </wss:service>
    </wss:binding>

</beans>

Schema to define request/response types
In this simple example we could get away with one simple schema but in a real world example you’ll probably end up with many. The way I’ve been organizing this is to have a base shared schema which defines common types. For example you might want all requests to include a username, password and environment and maybe the response should always have a boolean element to indicate success. Then you can write a schema for each endpoint defining the request and response types for all operations on that wsdl.

Here is the example base schema shared.xsd.

<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:jaxb="http://java.sun.com/xml/ns/jaxb"
           jaxb:version="2.0"
           targetNamespace="http://shared.testservices.com/">

  <!-- Settings for the JAXB code generation -->
  <xs:annotation>
    <xs:appinfo>
      <jaxb:schemaBindings>
        <!-- Set the package name for the generated classes -->
        <jaxb:package name="com.testservices.generated.shared" />
      </jaxb:schemaBindings>
    </xs:appinfo>
  </xs:annotation>

  <!-- Begin Types/Classes to be generated -->
  <xs:group name="baseRequest">
    <xs:sequence>
      <xs:element name="user" type="xs:string"/>
      <xs:element name="apikey" type="xs:string"/>
    </xs:sequence>
  </xs:group>

  <xs:group name="baseResponse">
    <xs:sequence>
      <xs:element name="success" type="xs:boolean"/>
    </xs:sequence>
  </xs:group>

</xs:schema>

This schema then imports the shared types and defines a simple request and response type for our example web service. Note that we can define which package the generated code belongs to by using the jaxb namespace extensions. There are a number of other customizations you can do like mapping xml types to Java types.

<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:jaxb="http://java.sun.com/xml/ns/jaxb"
           xmlns:sh="http://shared.testservices.com/"
           jaxb:version="2.0">

  <!-- Settings for the JAXB code generation -->
  <xs:annotation>
    <xs:appinfo>
      <jaxb:schemaBindings>
        <!-- Set the package name for the generated classes -->
        <jaxb:package name="com.testservices.generated.test1" />
      </jaxb:schemaBindings>
    </xs:appinfo>
  </xs:annotation>

  <xs:import namespace="http://shared.testservices.com/" schemaLocation="shared.xsd" />

  <xs:complexType name="test1Request">
    <xs:sequence>
      <xs:group ref="sh:baseRequest"/>
      <xs:element name="id" type="xs:int" />
    </xs:sequence>
  </xs:complexType>

  <xs:complexType name="test1Response">
    <xs:sequence>
      <xs:group ref="sh:baseResponse"/>
      <xs:element name="field1" type="xs:string" />
      <xs:element name="field2" type="xs:string" />
    </xs:sequence>
  </xs:complexType>

</xs:schema>

Finally add the endpoint interface/class
Having saved those schemas there should now be a generated class called Test1Request and another called Test1Response. We’re now ready to start piecing things together. The final stage is to define the JAX-WS handler and add a web method that uses these classes as inputs and outputs.

@WebService
public interface Test1 {
	Test1Response test1(Test1Request request);
}

So we have a very simple interface which we annotate with the JAX-WS @WebService annotation. There is one method which uses our generated classes. When this code is deployed JAX-WS will do the legwork and generate the WSDL with this one method on it. We just need to implement this interface now.

@Component("test1Services")
@WebService(endpointInterface = "com.testservices.endpoint.Test1")
public class Test1Impl implements Test1 {

	ObjectFactory	fact	= new ObjectFactory();

	public Test1Response test1(Test1Request request) {
		System.out.println("User: " + request.getUser());
		System.out.println("ID: " + request.getId());

		Test1Response response = fact.createTest1Response();

		response.setField1("Value 1");
		response.setField2("Value 2");

		return response;
	}
}

The implementation uses the JAXB object factory, which was generated for us, to create a response object. We then hardcode some values just to see if it works. In practice you’ll be able to inject a spring service here and go off and do whatever business logic is required. The class has been annotated as a spring component called test1Services. It’s important that this matches the bean name given in the spring config file for JAX-WS. The @WebService annotation names the interface that we’ve implemented.

You should now be able to fire up your test server and see this running. If you go to http://localhost:PORT/ProjectName/service/test1 you should see a page confirming the web service has been deployed with a link to the wsdl (just append ?wsdl).

Calling RPG on the AS400 from Java

RPG is the native language on the IBM AS400 midrange server (aka iSeries, System i and now just “i”). In a recent project I had to find a way to call a number of RPG programs from a Java application. If you’re in this situation then there are a few options available.

  • PCML (the subject of this article)
  • SQL Stored procedure
  • Integrated web services server (IWS)

Stored procedure
One possibility is to write a SQL stored procedure using RPG which could be called from Java using JDBC. This option may not be viable depending on the parameters your program needs. A stored procedure is good for returning result sets of records but you can only return one and you can’t pass one in.

Integrated web services server (IWS)
If you want to quickly expose an RPG program as a web service then you might want to look at IWS. This is a quick way to get up and running but there are a number of limitations.

  • If you want multiple operations on the same wsdl you have to write them all as procedures in the same service program
  • If using a service program IWS only supports up to 7 parameters including both inputs and outputs
  • IWS only supports contract last development. In other words you have to write the code first to get the wsdl.
  • The generated wsdl has a number of duplicated elements which you have to manually remove to tidy the appearance.
  • If you change the parameters or operations you have to go through the whole wizard again on each machine you deploy to.
  • Arrays are fixed size in RPG so IWS always returns all elements, even if some are simply blanks.

PCML
The most flexible method is Program Call Markup Language (PCML). This is an API that IBM provided for just this scenario. PCML is an XML language for defining the parameter list for an RPG program. This can then be used from a Java application.

Generating the PCML
You could write the pcml file by hand but a better way is to get the RPG compiler to generate it for you. First let’s write a simple RPG program that we want to call.

D CONVTEMP        PR                  Extpgm('CONVTEMP') 
D  iCelsius                      9  3 Const                   
D  oFahrenheit                   9  3                    
D CONVTEMP        PI                                     
D  iCelsius                      9  3 Const                   
D  oFahrenheit                   9  3                    
   // We could get an API to return whatever we like here
   oFahrenheit = ((iCelsius * 9) / 5) + 32;              

Yes, it’s the cliche web service example to convert a temperature from Celsius into Fahrenheit. It’s a good one to start with though because it has both an input and an output but is still fairly simple. Note that the input parameter has been defined as a const; this is significant when we generate the PCML.
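
For reference, here is the RPG formula translated into Java, mirroring the packed(9:3) precision of the output parameter. This class is just an illustration for checking results, the real conversion happens in the RPG program:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class TempConverter {

	// Same formula as the RPG program: F = ((C * 9) / 5) + 32,
	// held at 3 decimal places to mirror the packed(9:3) field
	public static BigDecimal celsiusToFahrenheit(BigDecimal celsius) {
		return celsius.multiply(new BigDecimal(9))
				.divide(new BigDecimal(5), 3, RoundingMode.HALF_UP)
				.add(new BigDecimal(32));
	}
}
```

So 25.2 degrees Celsius should come back from CONVTEMP as 77.360.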

To generate the PCML from here you need to prompt the compile command and set the following options: set PGMINFO to *PCML and INFOSTMF to a path on the IFS where you want your generated file to go, e.g. /mylib/CONVTEMP.pcml. Doing so gives the following PCML.

<pcml version="4.0">
   <program name="CONVTEMP" path="/QSYS.LIB/MYLIB.LIB/CONVTEMP.PGM">
      <data name="ICELSIUS" type="packed" length="9" precision="3" usage="input" />
      <data name="OFAHRENHEIT" type="packed" length="9" precision="3" usage="inputoutput" />
   </program>
</pcml>

We now have an XML file that describes how to call this program. Notice that the iCelsius parameter has been set to input but oFahrenheit is inputoutput. This is a result of setting iCelsius to a const parameter. When making a PCML call you must set a value for all input parameters. The default is inputoutput which can go both ways but is inconvenient if you don’t have an input value to set. Unfortunately there’s no language feature in RPG to set a parameter to output only so you have to adjust these manually.

      <data name="OFAHRENHEIT" type="packed" length="9" precision="3" usage="output" />                                            

Java dependencies
To make a PCML program call you just need the jt400 jar on your classpath. You can either use the IBM version that comes bundled with the AS400 or the open source JTOpen version.

If you use maven then you can simply declare it as a dependency like this.

<dependency>
	<groupId>net.sf.jt400</groupId>
	<artifactId>jt400</artifactId>
	<version>6.7</version>
</dependency>
This works fine but sadly the JTOpen developers stopped publishing to maven central at version 6.7 (current version is 7.10 at time of writing).

Calling the program
This is everything you need to make the program call. The Java class below is a simple test that opens a connection, calls the program and returns the result.

import java.math.BigDecimal;

import com.ibm.as400.access.AS400;
import com.ibm.as400.data.PcmlException;
import com.ibm.as400.data.ProgramCallDocument;

public class ConvertTemperature {

	private AS400	as400;

	public ConvertTemperature() {
		as400 = new AS400("SYSTEM", "USERNAME", "PASSWORD");
	}

	public BigDecimal celsiusToFahrenheit(BigDecimal celsius) {
		BigDecimal fahrenheit = null;
		try {
			ProgramCallDocument pcml = new ProgramCallDocument(as400, "CONVTEMP");

			pcml.setValue("CONVTEMP.ICELSIUS", celsius);
			boolean rc = pcml.callProgram("CONVTEMP");
			if(rc) {
				fahrenheit = (BigDecimal) pcml.getValue("CONVTEMP.OFAHRENHEIT");
			}
		} catch(PcmlException e) {
			e.printStackTrace();
		}
		return fahrenheit;
	}

	public static void main(String[] args) {
		ConvertTemperature ct = new ConvertTemperature();
		System.out.println(ct.celsiusToFahrenheit(new BigDecimal("25.2")));
	}
}


The second parameter to the ProgramCallDocument constructor is the path on the classpath to the PCML xml document. I created a file called CONVTEMP.pcml and put it in the src/main/resources folder. To keep things simple this is the root classpath folder; the .pcml suffix is not required as it is implied.

The PCML API will automatically handle converting Java types to AS400 types and back again. In this example the packed decimal from the AS400 becomes a BigDecimal in Java.

This is obviously a basic example that works as a proof of concept but there are a few additions worth mentioning if you want to use this in a production environment.

Adding connection pooling
Each time you create an AS400 object you’re opening a physical connection to the AS400. Each new connection creates a new job on the AS400. It’s obviously a bit wasteful to then throw this away and start with a fresh connection on the next call. A much better solution is to create a connection pool.

First you need to create the connection pool object. This code should live in its own class so the pool can be shared by different parts of the application. You could also load a properties file from the classpath to set the connection pool properties.

AS400ConnectionPool pool = new AS400ConnectionPool();

Now each time you want a connection you simply ask the pool. If no connections exist then one will be created.

AS400 as400 = pool.getConnection("SYSTEM", "USERNAME", "PASSWORD");

Remember to always return the connection back to the pool once you’re finished with it. This should be done in the finally section of the try/catch block to ensure the connection is returned even if an exception is thrown.

pool.returnConnectionToPool(as400);
Setting a library list
The PCML file generated had a fixed path to a specific library. In practice you may find the program exists in different libraries and you want to use the one at the top of the library list. To do this we must first change the PCML file to not hardcode the library.

Change this:

<program name="CONVTEMP" path="/QSYS.LIB/MYLIB.LIB/CONVTEMP.PGM">
To this:

<program name="CONVTEMP" path="/QSYS.LIB/%LIBL%.LIB/CONVTEMP.PGM">

Next we need an event listener that will set the library list when a new connection is opened.

import com.ibm.as400.access.AS400;
import com.ibm.as400.access.AS400Message;
import com.ibm.as400.access.CommandCall;
import com.ibm.as400.access.ConnectionPoolEvent;
import com.ibm.as400.access.ConnectionPoolListener;

public class AS400ConnectionPoolListener implements ConnectionPoolListener {

	public void connectionCreated(ConnectionPoolEvent event) {
		AS400 as400 = (AS400) event.getSource();
		CommandCall command = new CommandCall(as400);
		try {
			String liblCommand = "CHGLIBL LIBL(QTEMP PCMLTEST QGPL)";
			if(command.run(liblCommand) != true) {
				// Show the messages (returned whether or not there was an error.)
				AS400Message[] messagelist = command.getMessageList();
				for(int count = 0; count < messagelist.length; count++) {
					// Show each message.
					System.out.println("System message: " + messagelist[count].getText());
				}
			}
		} catch(Exception e) {
			e.printStackTrace();
		}
	}

	public void connectionExpired(ConnectionPoolEvent event) {
		// Not currently overridden
	}

	public void connectionPoolClosed(ConnectionPoolEvent event) {
		// Not currently overridden
	}

	public void connectionReleased(ConnectionPoolEvent event) {
		// Not currently overridden
	}

	public void connectionReturned(ConnectionPoolEvent event) {
		// Not currently overridden
	}

	public void maintenanceThreadRun(ConnectionPoolEvent event) {
		// Not currently overridden
	}
}

Finally the event listener needs to be registered as an observer of the connection pool.

pool.addConnectionPoolListener(new AS400ConnectionPoolListener());

This is a basic example that shows how to call an RPG program from Java. To brush this up a bit for production you only really need a few classes to wrap the connection pool and the loading of properties. This would allow you to set the library list on different servers with a simple properties file. I would use spring to load the properties and register a bean that holds the connection pool. If you have to support multiple environments then you could set the library list each time you get a connection. Alternatively it might be more efficient to pass the environment to the RPG program and handle it on the AS400.

A first knockout custom binding to display twitter bootstrap alerts

Just had a go at my first knockout custom binding. I wanted to be able to bind an observableArray to a div, and for that div to display as a twitter bootstrap alert with each array item on a separate line.

If you want to get TB working with knockout quickly then you may want to check out this library.
I decided not to use it for my alerts because I wanted my errors to be in a single dismissable alert. It would be a pain for the user to have to dismiss all the alerts separately. Also, I’m trying to learn knockout so writing my own is all part of the fun. 🙂

So with that out of the way this is what I came up with.
First in my html I have this very simple markup.

<div data-bind="alertList: errors, priority: 'error'"></div>

I’m using my new binding called alertList to bind an array to this div. The priority option will allow me to choose an error, warning, info or success alert.

This is what the JSON looks like coming back from the server. The main things to note are that it’s an array of errors and each error has a message attribute.

[{"message":"Password may not be null","messageTemplate":"{javax.validation.constraints.NotNull.message}","path":"User.createUser.arg0.password","invalidValue":null},
{"message":"Language may not be null","messageTemplate":"{javax.validation.constraints.NotNull.message}","path":"User.createUser.arg0.supportedLanguage","invalidValue":null},
{"message":"Username may not be null","messageTemplate":"{javax.validation.constraints.NotNull.message}","path":"User.createUser.arg0.username","invalidValue":null}]
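
If you just want the message text out of that structure, say to feed a plain list, a small helper like this does it. It's not part of the binding, purely an illustration:

```javascript
// Pull just the human-readable messages out of the server's validation errors
function extractMessages(errors) {
	return errors.map(function(err) {
		return err.message;
	});
}
```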

In the view model we’ll typically have an ajax request to get a rest resource. If that results in an error then we can populate the observable errors array with the response from the server. The custom binding will then magically render these in the div with the correct styling. This is a stripped down example just to give the idea.

define(["jquery", "knockout", "mapping"], function($, ko, mapping) {
	"use strict";
	var FormVM = function FormVM() {
		var self = this;
		self.errors = ko.observableArray();
		self.submit = function() {
			$.ajax({
				url: "test",
				type: "post",
				error: function(jqXHR) {
					// Populate the errors array from the server's JSON response
					self.errors(JSON.parse(jqXHR.responseText));
				}
			});
		};
	};
	return FormVM;
});

Finally we just need the custom binding to glue it all together.

define(["jquery", "knockout"], function($, ko) {
	"use strict";

	/**
	 * Binds an observable array to a dismissable alert box
	 */
	ko.bindingHandlers.alertList = {

		init: function(element, valueAccessor, allBindingsAccessor, viewModel, bindingContext) {
			var data = valueAccessor();
			var priority = allBindingsAccessor().priority;

			var alertClass = "alert-danger";
			if(priority === "info") {
				alertClass = "alert-info";
			} else if(priority === "warning") {
				alertClass = "alert-warning";
			} else if(priority === "success") {
				alertClass = "alert-success";
			}

			element.style.display = "none";
			element.innerHTML = "<button type=\"button\" data-hide=\"alert\" class=\"close\">x</button>" +
				"<h4 class=\"alert-heading\">Error</h4>";
			element.className = "alert " + alertClass + " alert-dismissable";

			var ul = document.createElement("ul");
			ul.className = "list-unstyled";
			ul.innerHTML = "<li><span data-bind='text: message'></span></li>";
			element.appendChild(ul);

			ko.applyBindingsToNode(ul, { foreach: data });
			return { controlsDescendantBindings: true };
		},

		update: function(element, valueAccessor, allBindingsAccessor, viewModel, bindingContext) {
			var value = ko.utils.unwrapObservable(valueAccessor());
			if(value.length > 0) {
				$(element).slideDown("fast");
			} else {
				$(element).fadeOut("fast");
			}
		}
	};
});
The init function does all of the initial construction. We take the basic div and add in the bootstrap classes we need. We also add a ul element which we use to create a list of errors from the message property of the array items. Finally we check the priority option to determine the class used to display the alert (e.g. error or warning etc).

The update function just adds in some jQuery animations to make the alert slideDown when you give it some elements and then fade out if you remove the elements.

I also added the following bit of jQuery to make the dismiss button work. The default behavior for a bootstrap dismissable alert is to remove the element from the page, but we only want to hide it so we can display it again if we add more errors to the observable. We add a click event to any element with the data-hide attribute to fade out the alert when clicked.

$(document).on("click", "[data-hide]", function() {
	$(this).closest("." + $(this).attr("data-hide")).fadeOut("fast");
});

Alert Binding

Source is also available as a gist.

Wild card tiles defs with spring MVC

A while ago I wrote about using Spring MVC to automatically scan for xml tiles definitions. This works beautifully, but I later realised that you could go one better and do away with individual definitions in favour of wildcards, essentially implementing your own convention over configuration.

The spring setup is the same as in the previous article. You need to have:

  • A tiles view resolver
  • A spring bean of type TilesConfigurer with completeAutoload set to true
  • You need to include tiles 2.2 or greater and tiles-extras.jar

This is explained in greater detail in the previous post so here’s the link again if you need more detail.
Auto scanning tiles defs with spring MVC

Setting up wildcards definitions
First you need to decide on your folder structure. I’ve put everything under WEB-INF/jsp. Under this folder I have one folder called template which holds all the jsp fragments for headers and footers, plus the few xml tiles defs. The tiles definitions will still be automatically scanned, there will just be fewer of them. The second folder is called views which contains the actual pages with body content.

I’m using a base definition which I plan to override for different sections of the site. Each section can then have its own sidebar coming from a different jsp fragment. For example the admin section of the site will override the base definition with an admin specific sidebar. All admin related pages should use this admin definition.

Here’s the tiles definitions file with a simple base def and the overridden admin def.

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE tiles-definitions PUBLIC
       "-//Apache Software Foundation//DTD Tiles Configuration 2.0//EN"
       "http://tiles.apache.org/dtds/tiles-config_2_0.dtd">
<tiles-definitions>
    <definition name="base.definition" template="/WEB-INF/jsp/template/layout.jsp">
        <put-attribute name="title" value="" />
        <put-attribute name="header" value="/WEB-INF/jsp/template/header.jsp" />
        <put-attribute name="sidebar" value="/WEB-INF/jsp/template/sidebars/sidebar.jsp" />
        <put-attribute name="body" value="" />
        <put-attribute name="footer" value="/WEB-INF/jsp/template/footer.jsp" />
    </definition>
    <definition name="admin.definition" extends="base.definition">
        <put-attribute name="sidebar" value="/WEB-INF/jsp/template/sidebars/admin-sidebar.jsp" />
    </definition>
    <definition name="WILDCARD:admin/*" extends="admin.definition">
       <put-attribute name="title" value=" - {1}" />
       <put-attribute name="body" value="/WEB-INF/jsp/views/admin/{1}.jsp" />
    </definition>
</tiles-definitions>

We’re using the extends attribute to inherit as much as possible from the base definition. The admin definition just overrides the sidebar location. We could, if we wanted, use this as a base for defining individual tiles views. Instead we have a definition with a name of “WILDCARD:admin/*”. This means any tiles view requested where the name matches admin/* will use this definition. The rest of the tile name is used to fill the {1} replacement variable.

So for example, if your controller returns a string of “admin/user” spring will look for a tile with this name. It should find the wildcard def and insert the title as “user” and the body jsp as /WEB-INF/jsp/views/admin/user.jsp.
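
To make the substitution concrete, here is the same resolution written out as a throwaway Java method. This is purely illustrative, tiles does all of this for you:

```java
public class WildcardResolver {

	// Mimics how the WILDCARD:admin/* definition resolves {1}: it is
	// whatever the * matched in the requested tile name
	public static String resolveBody(String viewName) {
		String match = viewName.substring("admin/".length());
		return "/WEB-INF/jsp/views/admin/" + match + ".jsp";
	}
}
```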

There’s a bit more info in the tiles documentation.