Logging exceptions in Mule

The simplest way to log an exception thrown within a mule flow is to use the Mule Expression Language (MEL) with the Logger component

<logger level="ERROR" doc:name="Logger" message="#[exception.causeException]"/>
 

However, this only logs the message from the root cause. Sometimes, I need to log the full stack trace at debug level (or at error level, if it’s only developers reading the application log). To do this, mule provides a class called ExceptionUtils. For example,

<logger level="ERROR" doc:name="Logger" message="#[org.mule.util.ExceptionUtils.getFullStackTrace(exception)]"/>

Classes implementing the Callable interface can also access any exception thrown, using the exception payload. I had to do this in an application I was working on, because I needed to save any error messages to the database request buffer for audit purposes.

@Component("UpdateDB")
public class UpdateDB implements Callable {
	@Override
	public Object onCall(MuleEventContext muleContext) throws Exception {
		ExceptionPayload exceptionPayload = muleContext.getMessage().getExceptionPayload();
		String errorMessage = exceptionPayload.getRootException().getMessage();
		return null;
	}
}

Accessing mule flow variables from Java components

In one of the earlier steps of a mule flow, I extracted some data from the payload and stored them in flow variables. This was done to save information like the database primary key, which I would need later to update the status buffer, before transforming the payload to the required response message.

<flow name="someflow">
  <inbound-endpoint ref="jdbcEndpoint" doc:name="Generic" />
  ..
  <set-variable variableName="requestId" value="#[message.payload.requestId]"/>
  ..
</flow>

Getting to these flow variables from within a Java component turned out to be a lot harder than I had anticipated.

My first attempt was to use the @Mule annotation. I annotated my Java method as follows

    public void process(@Payload AmazingFilePayload payload,  
                        @Mule("flowVars['requestId']") Integer requestId) {
        // do stuff
    }

The MEL was valid because I could access the flow variable within the mule flow with

<logger level="DEBUG" message="#[flowVars['requestId']]"/>

However, the above Java threw a StringIndexOutOfBoundsException with the message String index out of range: -1. Looking through the documentation, I couldn’t see how to access flow variables at all with Java annotations.

In the end, I resorted to implementing the Callable interface. It seemed an unsatisfactory workaround to me, because

  1. the Java component was no longer a POJO
  2. I needed a different class for each update method, instead of writing a single class with many related methods

import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;
import org.mule.api.transport.PropertyScope;

public class UpdateBuffer implements Callable {
	@Override
	public Object onCall(MuleEventContext muleContext) throws Exception {
		// flow variables are stored as invocation-scoped message properties
		Integer requestId = (Integer) muleContext.getMessage().getProperty("requestId", PropertyScope.INVOCATION);
		Integer requestId2 = (Integer) muleContext.getMessage().getInvocationProperty("requestId");  // alternative
		return null;
	}
}

Spring dependency injection for Struts Actions

Being used to Spring MVC, I was surprised when I discovered that Struts did not use the Action bean I had created in the Spring config file when handling web requests. Basically, I needed a DAO wired into an existing Struts Action, so I created a bean with the appropriate setter for the Struts Action in the Spring config file. However, I got a nasty NullPointerException because the setter was never called.

This should have been obvious if I had given it a thought. The Struts Actions in the web app are managed by Struts, not Spring. To get Spring to perform dependency injection on Struts Actions, you need to use the DelegatingActionProxy.

In the Struts config file

<action path="/store/order" type="org.springframework.web.struts.DelegatingActionProxy" name="orderForm" validate="false">
	<forward name="success" path="/jsp/view.jsp" />
</action>

In the Spring config file (note that the bean name matches the Struts action path, which is how the proxy looks up the right bean)

<bean name="/store/order" class="com.whileloop.web.action.OrderAction">
    <property name="basketDao" ref="basketDao"/>
</bean>
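
To make the wiring concrete, here is a minimal sketch of what the Action class might look like. BasketDao is a hypothetical DAO interface, and I’m assuming the Spring context is loaded in the web app (e.g. via ContextLoaderPlugIn in the Struts config, or a ContextLoaderListener in web.xml). The only real requirement is a setter matching the basketDao property above.

package com.whileloop.web.action;

import org.apache.struts.action.Action;

public class OrderAction extends Action {

	// injected by Spring through the <property name="basketDao"> element above
	private BasketDao basketDao;

	public void setBasketDao(BasketDao basketDao) {
		this.basketDao = basketDao;
	}

	// execute(...) omitted; it can now use basketDao without fear of NPEs
}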

Log4j rolling file appenders in Windows

I’ve been using the DailyRollingFileAppender in log4j for years without any problems. It came as a surprise when my trusted appender failed to roll over in a new web service. A bit of googling made me realise it is a widespread problem. The only reason I hadn’t encountered this problem before was because I had exclusively developed for Linux, and now my new workplace is a Windows shop.

Essentially, the log4j DailyRollingFileAppender renames the day’s log file at the end of the day. This runs into file contention problems on Windows, and the renaming regularly fails. A very simple solution is to create the log file with the date prefix already in place, and thus avoid renaming it entirely. This is the approach taken by Geoff Mottram in the DatedFileAppender he released to the public domain back in 2005. (This is the appender I found configured for some of the web services deployed on the company’s mule server.)

The log4j crew also recognised this problem, and according to its bug tracker, the problem was fixed for 1.3. But since the 1.3 series has been abandoned, the patch is now available as part of Log4j Extras.

Using the new log4j rolling file appender

To include log4j extras using maven

	<dependency>
		<groupId>log4j</groupId>
		<artifactId>apache-log4j-extras</artifactId>
		<version>1.2.17</version>
	</dependency>    

A sample log4j.xml

	<appender name="FILE" class="org.apache.log4j.rolling.RollingFileAppender">
		<rollingPolicy class="org.apache.log4j.rolling.TimeBasedRollingPolicy">
	      		<param name="FileNamePattern" value="D:/logs/app-%d.log.gz"/>
	   	</rollingPolicy>
		<layout class="org.apache.log4j.PatternLayout">
			<param name="ConversionPattern" value="%d{ABSOLUTE} %-5p [%c{1}] %m%n" />
		</layout>
	</appender>

How often the log file rolls is specified by the date format in the FileNamePattern, which uses the same format syntax as Java’s SimpleDateFormat. By default (%d in app-%d.log), a new log file is created daily. To create a new log file every minute, use something like app-%d{yyyy-MM-dd-HH-mm}.log. The .gz suffix in app-%d.log.gz means old log files will be gzipped automatically.
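
To sanity-check the rollover locally, a throwaway loop like the one below does the job. It assumes the FILE appender above is attached to the root logger in log4j.xml, and works best with a minute-based FileNamePattern so you don’t have to wait until midnight.

import org.apache.log4j.Logger;

public class RolloverCheck {
	private static final Logger LOG = Logger.getLogger(RolloverCheck.class);

	public static void main(String[] args) throws InterruptedException {
		// log a line every 10 seconds for a few minutes, then check the
		// log directory for the rolled app-<date>.log.gz files
		for (int i = 0; i < 20; i++) {
			LOG.info("rollover check " + i);
			Thread.sleep(10000);
		}
	}
}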

More hamcrest collections goodness

I have been using Hamcrest more in my unit tests. JUnit 4.11 includes only a portion of the matchers available in Hamcrest 1.3 (specifically, the ones packaged in hamcrest-core). To include the other useful matchers from Hamcrest, add the following to the maven pom.xml

		<dependency>
			<groupId>org.hamcrest</groupId>
			<artifactId>hamcrest-library</artifactId>
			<version>1.3</version>
			<scope>test</scope>
		</dependency>

I found the collection matchers very handy. For example, to test the size of a list:

import static org.hamcrest.collection.IsCollectionWithSize.hasSize;
import static org.junit.Assert.assertThat;

import java.util.ArrayList;
import java.util.List;
...
List<String> list = new ArrayList<String>();
assertThat(list, hasSize(0));
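
A couple of the other matchers in hamcrest-library are just as handy. As an illustration (the values here are made up), contains also checks the order of the items, and hasEntry works on maps:

import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.hasEntry;
import static org.junit.Assert.assertThat;

import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
...
List<String> fruit = Arrays.asList("apples", "oranges");
assertThat(fruit, contains("apples", "oranges"));

Map<String, Integer> stock = new HashMap<String, Integer>();
stock.put("apples", 5);
assertThat(stock, hasEntry("apples", 5));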

JUnit 4.11 and its new Matchers

I had never used the Hamcrest matchers with JUnit before. Not until last week. I noticed in the release notes that JUnit 4.11 includes Hamcrest 1.3, with its Matchers and improved assertThat syntax. Reading the examples in the release notes, I was intrigued.

To use the new Matchers and assertThat, you need to include the following imports

import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;

Number Objects

The first improvement I noticed was comparisons with Java number objects.

Long l = new Long(10);
assertEquals(10L, l);    // does not compile: the call is ambiguous
assertThat(l, is(10L));  // works fine

With the old assertEquals, the compiler would complain that the method assertEquals(Object, Object) is ambiguous for the type X. You need to change both parameters to either long values or Long objects for the assert to work, for example

assertEquals(10L, l.longValue());

On the other hand, assertThat and the is() matcher work just fine.

Collections

I saw a few very handy-looking matchers for Collections in the CoreMatchers javadoc, for example hasItem, hasItems and everyItem. I had the opportunity to use hasItems in my unit tests last week, to check whether a List contains items from a given list of values. It was as simple to use as this

assertThat(list, hasItems("apples", "oranges"));
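
hasItem and everyItem work the same way. A quick illustration with made-up values (startsWith is also part of CoreMatchers in 1.3):

import static org.hamcrest.CoreMatchers.everyItem;
import static org.hamcrest.CoreMatchers.hasItem;
import static org.hamcrest.CoreMatchers.startsWith;
import static org.junit.Assert.assertThat;

import java.util.Arrays;
import java.util.List;
...
List<String> fruit = Arrays.asList("apples", "apricots");

// passes if at least one element matches
assertThat(fruit, hasItem("apples"));

// passes only if every element matches
assertThat(fruit, everyItem(startsWith("a")));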

I’m a fan of this new way of matching things in JUnit.

Mule Studio and Maven Profiles

The maven project I’m working on has profiles for different environments, such as testing, development and deployment.

<profiles>
	<profile>
		<id>test</id>
		<activation>
			<activeByDefault>true</activeByDefault>
		</activation>
		<properties>
			<db.host>testdb.mycompany.com</db.host>
			<db.name>projectx</db.name>
		</properties>
	</profile>

	<profile>
		<id>development</id>
		<properties>
			<db.host>127.0.0.1</db.host>
		</properties>
	</profile>
</profiles>

To activate multiple profiles at run time, you use the command line option -P

mvn test -P test,development

Or inside eclipse with m2e, you can configure a list of active profiles under Run Configurations.

However, with Mule Studio, if you run the project as a Mule Application with Maven, there are no options to select maven profiles.

The way to get around this is to edit the maven profiles to be activated by a property or a file. In my case, I updated my pom.xml to

<profiles>
	<profile>
		<id>Test</id>
		<activation>
			<activeByDefault>true</activeByDefault>
			<property>
				<name>env</name>
				<value>test</value>
			</property>
		</activation>
		<properties>
			<db.host>testdb.mycompany.com</db.host>
			<db.name>projectx</db.name>
		</properties>
	</profile>

	<profile>
		<id>Development</id>
		<activation>
			<file>
				<exists>.git</exists>
			</file>
		</activation>
		<properties>
			<db.host>127.0.0.1</db.host>
		</properties>
	</profile>
</profiles>

The Test profile is activated by setting the system property env to test. This is done in Mule Studio under Windows -> Preferences -> Mule Studio -> Maven Settings: in the “MAVEN_OPTS environment variable” text box, add -Denv=test. The Development profile is activated by the existence of a .git directory in the project root. Now when I run this as a mule+maven project in eclipse, the properties from both of these profiles are available.
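
To double-check which profile’s values actually made it into the build, one option (not part of the original setup, just an assumed illustration) is to filter the db.* properties into a resource file, say an app.properties containing db.host=${db.host} with Maven resource filtering switched on, and read it back:

import java.io.InputStream;
import java.util.Properties;

public class ActiveProfileCheck {
	public static void main(String[] args) throws Exception {
		Properties props = new Properties();
		// app.properties is a hypothetical filtered resource on the classpath
		try (InputStream in = ActiveProfileCheck.class.getResourceAsStream("/app.properties")) {
			props.load(in);
		}
		// prints testdb.mycompany.com or 127.0.0.1 depending on the active profile
		System.out.println("db.host = " + props.getProperty("db.host"));
	}
}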

You might ask, wouldn’t it be easier to just add -P test,development to the MAVEN_OPTS text box? It definitely would be, but Mule Studio complained about -P being an unrecognized option.

PS. I’m using Mule Studio 3.5.