Installing Anypoint Studio Plug-in 5.3.0 in Eclipse Mars

Installing Anypoint Studio should be a straightforward process, according to the official documentation on the MuleSoft website: you add MuleSoft to the list of available software sites, select Anypoint Studio, and click Finish. The Eclipse install software wizard should, in theory, pull in the required dependencies.

However, it didn’t work for me. I got the following exception:

No repository found containing: osgi.bundle,org.eclipse.m2e.archetype.common,

I had a Java EE installation of Eclipse Mars (4.5), which already included Maven support via m2e-wtp. Anypoint Studio wasn’t happy with this version of m2e, and was also unable to pull in the required version of m2e itself. I had to add m2e’s update site manually and install m2e. After that, Anypoint Studio installed without a problem.

Unit test Audit Logging Requirements with Mockito

During development of server-side software, I routinely encountered audit logging requirements. These ranged from logging routine events, such as system startup and shutdown, to recording information when errors occur. For example, a component I worked on was responsible for generating binary data structures and forwarding them to a distribution component. The data generator had to limit the size of the structures, to protect end devices from data packets too large to be processed within their resource constraints. The requirements stipulated that data attributes should be placed into the structure in ascending order, and that an error message shall be logged in the audit log for any attribute that was not included in the structure. I added verification of the audit logging to the unit tests that exercise the handling of attribute omission. Since unit tests already covered the error handling, it was little extra effort to also check the logging under the same conditions.
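As a concrete illustration of the kind of logic under test, here is a minimal, dependency-free sketch of the packing rule described above: attributes are added in ascending id order until the size limit is reached, and the ids that did not fit are returned so the caller can write an audit log entry for each. All class and method names here are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class StructureBuilder {

    /**
     * Packs attribute payloads into the structure in ascending id order
     * until maxBytes is used up. Returns the ids that did not fit; the
     * caller is expected to log an error to the audit log for each one.
     */
    static List<Integer> pack(TreeMap<Integer, byte[]> attributes, int maxBytes,
                              List<byte[]> structure) {
        List<Integer> omitted = new ArrayList<>();
        int used = 0;
        // TreeMap iterates in ascending key order, satisfying the ordering rule
        for (Map.Entry<Integer, byte[]> e : attributes.entrySet()) {
            if (used + e.getValue().length <= maxBytes) {
                structure.add(e.getValue());
                used += e.getValue().length;
            } else {
                omitted.add(e.getKey());
            }
        }
        return omitted;
    }
}
```

Each id in the returned list corresponds to one mandated audit log entry, which is exactly what the unit tests below verify.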

A naïve approach to unit testing logging is to trigger the error within a unit test, write the log to a file, read the log file back from disk, and then search it for the expected error message. However, there is a simpler and more elegant way to assert logging behavior, using mock objects. By adding a mock Appender to the target Logger, all log requests to that Logger are also forwarded to the mock Appender. Logging events can then be verified directly against the mock Appender.

I use JUnit and Mockito for my unit testing. The following code snippet sets up a mock Appender for use in the unit test class.

@Mock private Appender appender;
private static final Logger AUDIT = Logger.getLogger("");

@Before
public void setUp() {
	MockitoAnnotations.initMocks(this);  // initialise the @Mock fields
	AUDIT.addAppender(appender);
}

@After
public void tearDown() {
	AUDIT.removeAppender(appender);  // avoid leaking the mock into other tests
}

To assert that a log message is written during a unit test, I use Mockito’s verify on the mock Appender’s doAppend method:

verify(appender).doAppend((LoggingEvent) anyObject());

Obviously, this only checks that some log message was written when the unit test ran. I normally run unit tests with logging at trace level, to make sure logging calls at the lowest level are exercised. If a debug message is written during the unit test, the above verification will pass even when no error messages were written.

Mockito’s ArgumentCaptor can be used to check for a specific message at a specific logging level. The following code captures all the LoggingEvents that occur in a unit test, then iterates over the captured events, checking their logging level and message content. If one of the log messages matches the expected level and keyword, the test passes the assert statement after the loop. Searching for an exact match of the whole message can lead to brittle tests that break whenever the log message is reworded, so I prefer searching for keywords like ‘error’, ‘exceeds’ or ‘dropped’ instead.

ArgumentCaptor<LoggingEvent> argumentCaptor = ArgumentCaptor.forClass(LoggingEvent.class);
verify(appender, atLeastOnce()).doAppend(argumentCaptor.capture());
List<LoggingEvent> loggingEvents = argumentCaptor.getAllValues();
boolean matched = false;
for (LoggingEvent le : loggingEvents) {
  if (le.getLevel().equals(Level.ERROR) &&
      le.getMessage().toString().contains(keyword)) {
    matched = true;
  }
}
assertTrue("Cannot find error message [" + keyword + "] in audit log", matched);
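If Mockito is not on the classpath, the same remember-all-interactions idea can be hand-rolled with the JDK’s own java.util.logging: a recording Handler plays the role of the mock Appender, and the verification becomes a search over the recorded events. This is only a sketch; the class, logger and message names are invented.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class RecordingHandlerDemo {

    // Hand-rolled stand-in for a mock Appender: remembers every LogRecord.
    static class RecordingHandler extends Handler {
        final List<LogRecord> records = new ArrayList<>();
        @Override public void publish(LogRecord record) { records.add(record); }
        @Override public void flush() {}
        @Override public void close() {}
    }

    // True if an error-level message containing the keyword was logged.
    static boolean loggedError(RecordingHandler handler, String keyword) {
        for (LogRecord r : handler.records) {
            if (r.getLevel().equals(Level.SEVERE) && r.getMessage().contains(keyword)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Logger audit = Logger.getLogger("audit");
        RecordingHandler handler = new RecordingHandler();
        audit.addHandler(handler);
        audit.setUseParentHandlers(false);  // keep the demo output quiet

        audit.severe("attribute 42 dropped: size exceeds limit");

        if (!loggedError(handler, "dropped")) {
            throw new AssertionError("Cannot find error message in audit log");
        }
    }
}
```

The keyword search mirrors the ArgumentCaptor loop above: checking for a distinctive word rather than the full message keeps the test robust against rewording.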

Mock objects provide an easy way to test logging in unit tests. I exploit the fact that a mock object remembers all its interactions: logging can be verified by inspecting calls to the mock Appender’s doAppend method, instead of reading and parsing a log file on disk. By including logging in unit tests, I can verify that audit logging requirements are fulfilled, and guard against future code changes inadvertently breaking compliance with those requirements. Problems with audit logging caused by refactoring are caught automatically during development, before they can reach testers or even clients.

This post was originally written and published in August 2013 for a newsletter.

Verify nulls with Mockito argument matchers

When using verify in Mockito, the function arguments must either all be exact values or all be argument matchers; you cannot mix the two within the same verify. In other words, the following two statements are correct:

verify(mockFtpClient).rename("a.txt", "b.txt");
verify(mockFtpClient).rename(eq("a.txt"), eq("b.txt"));

But this is not:

verify(mockFtpClient).rename(eq("a.txt"), "b.txt");

Today, I needed to match an anyString() and a null in the same verify statement. Mockito’s documentation didn’t have an example of an argument matcher for a null String. It turns out there is a Hamcrest-style isNull matcher in Mockito:

verify(mockFtpClient).rename(anyString(), org.mockito.Matchers.isNull(String.class));

(The above example makes no sense semantically, because you’d never want to pass a null string to the FtpClient’s rename function.)

Logging exceptions in Mule

The simplest way to log an exception thrown within a Mule flow is to use the Mule Expression Language with the Logger component:

<logger level="ERROR" doc:name="Logger" message="#[exception.causeException]"/>

However, this only logs the message from the root cause. Sometimes I need to log the full stack trace, at debug level (or at error level, if only developers read the application log). For this, Mule provides a class called ExceptionUtils. For example:

<logger level="ERROR" doc:name="Logger" message="#[org.mule.util.ExceptionUtils.getFullStackTrace(exception)]"/>

Classes implementing the Callable interface can also access any exception thrown, via the exception payload. I had to do this in an application I was working on, because I needed to save error messages in the database request buffer for audit purposes.

public class UpdateDB implements Callable {
	public Object onCall(MuleEventContext muleContext) throws Exception {
		ExceptionPayload exceptionPayload = muleContext.getMessage().getExceptionPayload();
		String errorMessage = exceptionPayload.getRootException().getMessage();
		// save errorMessage to the request buffer here ...
		return null;
	}
}
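To route errors to such a component, it can be referenced from an exception strategy. The sketch below assumes Mule 3’s catch-exception-strategy; the flow name and class name are invented for illustration:

```xml
<flow name="updateFlow">
    <inbound-endpoint ref="jdbcEndpoint" doc:name="Generic"/>
    <!-- normal processing steps ... -->
    <catch-exception-strategy>
        <!-- onCall sees the error via the message's exception payload -->
        <component class="com.example.UpdateDB"/>
    </catch-exception-strategy>
</flow>
```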

Accessing mule flow variables from Java components

In one of the earlier steps of a Mule flow, I extracted some data from the payload and stored it in flow variables. This saved information, like the database primary key, that I would need later to update the status buffer, before the payload was transformed into the required response message.

<flow name="someflow">
  <inbound-endpoint ref="jdbcEndpoint" doc:name="Generic" />
  <set-variable variableName="requestId" value="#[message.payload.requestId]"/>
  ...
</flow>

Getting at these flow variables from within a Java component turned out to be a lot harder than I had anticipated.

My first attempt was to use the @Mule annotation. I annotated my Java method as follows:

    public void process(@Payload AmazingFilePayload payload,
                        @Mule("flowVars['requestId']") Integer requestId) {
        // do stuff
    }

The MEL was valid, because I could access the flow variable within the Mule flow with:

<logger level="DEBUG" message="#[flowVars['requestId']]"/>

However, the Java above threw a StringIndexOutOfBoundsException with the message “String index out of range: -1”. Looking through the documentation, I couldn’t see how to access flow variables with Java annotations at all.

In the end, I resorted to implementing the Callable interface. It seemed an unsatisfactory workaround to me, because

  1. the Java component was no longer a POJO; and
  2. I needed a different class for each update method, instead of writing a single class with many related methods.

public class UpdateBuffer implements Callable {
	public Object onCall(MuleEventContext muleContext) throws Exception {
		Integer requestId = (Integer) muleContext.getMessage().getProperty("requestId", PropertyScope.INVOCATION);
		Integer requestId2 = (Integer) muleContext.getMessage().getInvocationProperty("requestId");  // alternative
		return null;
	}
}

Spring dependency injection for Struts Actions

Being used to Spring MVC, I was surprised to discover that Struts did not use the Action bean I had created in the Spring config file when handling web requests. Basically, I needed a DAO wired into an existing Struts Action, so I created a bean with the appropriate setter for the Struts Action in the Spring config file. However, I got a nasty NullPointerException, because the setter was never called.

This should have been obvious had I given it any thought: the Struts Actions in the web app are managed by Struts, not Spring. To get Spring to perform dependency injection on Struts Actions, you need to use the DelegatingActionProxy.

In the Struts config file

<action path="/store/order" type="org.springframework.web.struts.DelegatingActionProxy" name="orderForm" validate="false">
	<forward name="success" path="/jsp/view.jsp" />
</action>

In the Spring config file

<bean name="/store/order" class="com.whileloop.web.action.OrderAction">
    <property name="basketDao" ref="basketDao"/>
</bean>
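On the Java side, the Action only needs an ordinary setter for Spring to call. A minimal, dependency-free sketch follows; the DAO interface and method bodies are invented, and the real OrderAction would extend org.apache.struts.action.Action, which is left out here to keep the sketch self-contained:

```java
public class OrderActionSketch {

    // Stand-in for the real DAO interface wired in by Spring
    interface BasketDao {
        String findBasket(String id);
    }

    // The real class extends org.apache.struts.action.Action
    static class OrderAction {
        private BasketDao basketDao;

        // Spring calls this when wiring the bean named "/store/order"
        public void setBasketDao(BasketDao basketDao) {
            this.basketDao = basketDao;
        }

        public BasketDao getBasketDao() {
            return basketDao;
        }
    }

    public static void main(String[] args) {
        // Simulate what the Spring container does with the <property> element
        OrderAction action = new OrderAction();
        action.setBasketDao(id -> "basket-" + id);
        System.out.println(action.getBasketDao().findBasket("42"));
    }
}
```

The point of DelegatingActionProxy is that Struts forwards the request to the Spring-managed bean whose name matches the action path, so this setter actually gets called.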

Log4j rolling file appenders in Windows

I’ve been using the DailyRollingFileAppender in log4j for years without any problems, so it came as a surprise when my trusted appender failed to roll over in a new web service. A bit of googling made me realise it is a widespread problem. The only reason I hadn’t encountered it before was that I had exclusively developed for Linux, and my new workplace is a Windows shop.

Essentially, the log4j DailyRollingFileAppender renames the day’s log file at the end of the day. This runs into file contention problems on Windows, and the renaming regularly fails. A very simple solution is to create the log file with the date prefix already in place, avoiding the rename entirely. This is the approach taken by Geoff Mottram in the DatedFileAppender he released into the public domain back in 2005. (This is the appender I found configured for some of the web services deployed on the company’s Mule server.)

The log4j team also recognised this problem and, according to its bug tracker, fixed it for 1.3. But since the 1.3 series has been abandoned, the fix is now available as part of Log4j Extras.

Using the new log4j rolling file appender

To include Log4j Extras using Maven, add the following dependency to the pom.xml (the version shown is an assumption; it should match your log4j 1.2.x release):

<dependency>
    <groupId>log4j</groupId>
    <artifactId>apache-log4j-extras</artifactId>
    <version>1.2.17</version>
</dependency>

A sample log4j.xml

	<appender name="FILE" class="org.apache.log4j.rolling.RollingFileAppender">
		<rollingPolicy class="org.apache.log4j.rolling.TimeBasedRollingPolicy">
			<param name="FileNamePattern" value="D:/logs/app-%d.log.gz"/>
		</rollingPolicy>
		<layout class="org.apache.log4j.PatternLayout">
			<param name="ConversionPattern" value="%d{ABSOLUTE} %-5p [%c{1}] %m%n" />
		</layout>
	</appender>

How often the log file rolls is determined by the date format in the FileNamePattern, which uses the same patterns as Java’s SimpleDateFormat. By default (%d in app-%d.log), a new log file is created daily. To create a new log file every minute, use something like app-%d{yyyy-MM-dd-HH-mm}.log. The .gz suffix in app-%d.log.gz means old log files are gzipped automatically.
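To see what a given pattern expands to, the %d token can be reproduced with SimpleDateFormat directly, since the rolling policy uses the same format strings. A small JDK-only sketch (the helper name is mine, and UTC is forced only to make the output deterministic):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class RollPatternDemo {

    // Expands an app-%d{datePattern}.log style file name for a given instant.
    static String expand(String datePattern, Date when) {
        SimpleDateFormat sdf = new SimpleDateFormat(datePattern);
        sdf.setTimeZone(TimeZone.getTimeZone("UTC")); // deterministic for the demo
        return "app-" + sdf.format(when) + ".log";
    }

    public static void main(String[] args) {
        // daily rolling (the default %d pattern is yyyy-MM-dd)
        System.out.println(expand("yyyy-MM-dd", new Date()));
        // minute-level rolling
        System.out.println(expand("yyyy-MM-dd-HH-mm", new Date()));
    }
}
```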

More hamcrest collections goodness

I have been using Hamcrest more in my unit tests. JUnit 4.11 includes only a portion of the matchers available in Hamcrest 1.3 (specifically, the ones packaged in hamcrest-core). To get the other useful matchers from Hamcrest, add the following to the Maven pom.xml:

<dependency>
    <groupId>org.hamcrest</groupId>
    <artifactId>hamcrest-library</artifactId>
    <version>1.3</version>
</dependency>

I found the collection matchers very handy. For example, to test the size of a list:

import static org.hamcrest.collection.IsCollectionWithSize.hasSize;
import static org.junit.Assert.assertThat;

List<String> list = new ArrayList<String>();
assertThat(list, hasSize(0));

JUnit 4.11 and its new Matchers

I had never used Hamcrest matchers with JUnit before, not until last week. I noticed in the release notes that JUnit 4.11 includes Hamcrest 1.3, with its Matchers and an improved assertThat syntax. Reading the examples in the release notes, I was intrigued.

To use the new Matchers and assertThat, you need to include the following imports:

import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;

Number Objects

The first improvement I noticed was the comparison of Java number objects.

Long l = new Long(10);
assertEquals(10L, l);    // does not compile
assertThat(l, is(10L));  // passes

With the old assertEquals, the compiler complains that “The method assertEquals(Object, Object) is ambiguous for the type X”. You need to change both parameters to either long values or Long objects for the assert to compile, for example:

assertEquals(10L, l.longValue());

On the other hand, assertThat and the is() matcher work just fine.


Looking at the CoreMatchers javadoc, I saw a few very handy looking matchers for collections, such as hasItem, hasItems and everyItem. Last week I had the opportunity to use hasItems in my unit tests, to check whether a List contains items from a given list of values. It was as simple as this:

assertThat(list, hasItems("apples", "oranges"));

I’m a fan of this new way of matching things in JUnit.

Mule Studio and Maven Profiles

The Maven project I’m working on has profiles for different environments, such as testing, development and deployment.

<profiles>
  <profile>
    <id>test</id>
    ...
  </profile>
  <profile>
    <id>development</id>
    ...
  </profile>
  <profile>
    <id>deployment</id>
    ...
  </profile>
</profiles>

To activate multiple profiles at run time, use the command-line option -P:

mvn test -P test,development

Or, inside Eclipse with m2e, you can configure a list of active profiles under Run Configurations.

However, when you run the project in Mule Studio as a Mule Application with Maven, there is no option to select Maven profiles.

The way around this is to change the Maven profiles to be activated by a property or a file. In my case, I updated my pom.xml to:

<profiles>
  <profile>
    <id>test</id>
    <activation>
      <property>
        <name>env</name>
        <value>test</value>
      </property>
    </activation>
    ...
  </profile>
  <profile>
    <id>development</id>
    <activation>
      <file>
        <exists>.git</exists>
      </file>
    </activation>
    ...
  </profile>
</profiles>

The test profile is activated by setting the system property env to test. In Mule Studio, this is done under Windows -> Preferences -> Mule Studio -> Maven Settings: in the “MAVEN_OPTS environment variable” text box, add -Denv=test. The development profile is activated by the existence of a .git file in the project root. Now, when I run this as a Mule + Maven project in Eclipse, the properties from both profiles are available.

You might ask: wouldn’t it be easier to just add -P test,development to the MAVEN_OPTS text box? It certainly would, but Mule Studio complained about -P being an unrecognized option.

PS: I’m using Mule Studio 3.5.