While loop in Mule ESB, or not

It’s a common use case to need an automated job that runs once a day, executing a list of queued requests received throughout the day. However, there is no direct support for while loops in Mule. Google led me to this article on DZone, which implements a while loop using subflows and recursion.

In this article I put this implementation of a while loop in Mule to the test.
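
The pattern, roughly, is a sub-flow that re-invokes itself via flow-ref while a condition holds. Below is a minimal sketch of the idea (mine, not the DZone article’s exact code; flowVars.counter and flowVars.max are placeholder variables):

<sub-flow name="whileLoop">
  <choice>
    <when expression="#[flowVars.counter &lt; flowVars.max]">
      <set-variable variableName="counter" value="#[flowVars.counter + 1]"/>
      <!-- do one iteration of work here -->
      <flow-ref name="whileLoop"/>
    </when>
    <otherwise>
      <logger message="while loop finished" level="INFO"/>
    </otherwise>
  </choice>
</sub-flow>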

Logging outbound HTTP requests from JAX-RS client

In order to track down a bug, I needed to log HTTP requests sent from one of our web services to another third party web service. (We hosted the service, but the software was not developed in house).

Our web service was written with RESTEasy, a framework I was not especially familiar with. (I prefer the Spring stack, and always create new web services using Spring Boot.) The code that called the third party web service looked like this:

import javax.ws.rs.client.Entity;
import javax.ws.rs.client.Invocation.Builder;

builder.buildPost(Entity.form(form)).invoke();

Surprisingly, there wasn’t an obvious way to get at the request body being sent. According to various Stack Overflow Q&As, the way to log outbound JAX-RS client requests is to create an implementation of ClientRequestFilter and register it as a provider in the container.

import java.io.IOException;
import javax.ws.rs.client.ClientRequestContext;
import javax.ws.rs.client.ClientRequestFilter;
import javax.ws.rs.ext.Provider;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@Provider
public class MyClientRequestLoggingFilter implements ClientRequestFilter {
  private static final Logger LOG = LoggerFactory.getLogger(MyClientRequestLoggingFilter.class);

  @Override
  public void filter(ClientRequestContext requestContext) throws IOException {
    LOG.info(requestContext.getEntity().toString());
  }
}

You then configure your web.xml to scan for providers:

<context-param>
  <param-name>resteasy.scan.providers</param-name>
  <param-value>true</param-value>
</context-param>
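
Alternatively, if you construct the JAX-RS client yourself, you can register the filter programmatically instead of relying on container scanning (standard JAX-RS 2.0 API):

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;

Client client = ClientBuilder.newClient().register(MyClientRequestLoggingFilter.class);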

There are quite a few warnings that, because ClientRequestContext.getEntity() returns an Object, the default toString() may not produce what you expect. Marshalling the object into its wire format is required to log the actual request body.
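
One way to do that is to serialise the entity in the filter instead of calling toString(). Below is a sketch (my own, assuming the request entities are JSON-friendly and Jackson is on the classpath):

import java.io.IOException;
import javax.ws.rs.client.ClientRequestContext;
import javax.ws.rs.client.ClientRequestFilter;
import javax.ws.rs.ext.Provider;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.fasterxml.jackson.databind.ObjectMapper;

@Provider
public class JsonClientRequestLoggingFilter implements ClientRequestFilter {
  private static final Logger LOG = LoggerFactory.getLogger(JsonClientRequestLoggingFilter.class);
  private static final ObjectMapper MAPPER = new ObjectMapper();

  @Override
  public void filter(ClientRequestContext requestContext) throws IOException {
    if (requestContext.hasEntity()) {
      // serialise the entity rather than relying on its toString()
      LOG.info("{} {} body: {}", requestContext.getMethod(), requestContext.getUri(),
          MAPPER.writeValueAsString(requestContext.getEntity()));
    }
  }
}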

After banging my head against a wall for an afternoon, I decided to take a completely different approach to the problem. I googled how to enable request logging in Apache httpd instead. This turned out to be a much more straightforward way to achieve what I needed. The module mod_dumpio can be used to dump all input to and output from the server into a log file. You need mod_dumpio present in the Apache httpd installation. (On Windows, check that mod_dumpio.so is in c:\apache-install-dir\modules.) Stop the service, then edit the httpd.conf file to include the following lines:

LoadModule dumpio_module modules/mod_dumpio.so

ErrorLog "logs/error.log"
LogLevel debug
DumpIOInput On
DumpIOOutput On
LogLevel dumpio:trace7

The ErrorLog and LogLevel lines were already present in my httpd.conf. I changed the LogLevel to debug and added the last three lines to turn on the dumpio module. After a server restart, all HTTP requests and responses were successfully logged to the file logs/error.log.

Lesson learnt here: if an approach turns out to be more complicated than expected, it’s worth taking a step back and rethinking.

Using two datasources in a Spring Boot application

Using a single datasource in a Spring Boot application is very straightforward. Using multiple datasources in one application, however, is anything but! It took me quite a bit of googling and fiddling to find a solution that worked.

To use two datasources, you need to set one up as the primary; the other then becomes the secondary. You mark a datasource as primary using the primary attribute. Below is an example using XML-based configuration:

<bean id="greenDataSource" primary="true" class="org.apache.commons.dbcp2.BasicDataSource">
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
    <property name="url" value="${db.green.url}"/>
    <property name="username" value="${db.green.username}"/>
    <property name="password" value="${db.green.password}"/>
</bean>

Then define the secondary datasource like this:

<bean id="purpleDataSource" class="org.apache.commons.dbcp2.BasicDataSource">
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
    <property name="url" value="${db.purple.url}"/>
    <property name="username" value="${db.purple.username}"/>
    <property name="password" value="${db.purple.password}"/>
</bean>

You can then wire them into your Java classes using the @Autowired and @Primary annotations:

@Repository
public class AwesomeDaoImpl implements AwesomeDao {
    private JdbcTemplate greenJdbcTemplate;
    private JdbcTemplate purpleJdbcTemplate;

    @Autowired
    @Primary
    public void setGreenDataSource(DataSource greenDataSource) {
        this.greenJdbcTemplate = new JdbcTemplate(greenDataSource);
    }

    @Autowired
    public void setPurpleDataSource(DataSource purpleDataSource) {
        this.purpleJdbcTemplate = new JdbcTemplate(purpleDataSource);
    }
}
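
For reference, the same primary/secondary setup in Java configuration would look roughly like this (a sketch; the connection URLs are placeholders):

import javax.sql.DataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DataSourceConfig {
    // @Primary makes this bean the default whenever a single DataSource is injected
    @Bean
    @Primary
    public DataSource greenDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        ds.setUrl("jdbc:sqlserver://localhost:1433;DatabaseName=Green"); // placeholder
        return ds;
    }

    @Bean
    public DataSource purpleDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        ds.setUrl("jdbc:sqlserver://localhost:1433;DatabaseName=Purple"); // placeholder
        return ds;
    }
}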

I haven’t figured out how to plumb in more than two datasources without using JNDI. If JNDI is available, your Spring Boot application can access all the JNDI datasources using the @Resource annotation.

@Repository
public class ColourDaoImpl implements ColourErrorDao {
    private JdbcTemplate jdbcTemplate;

    @Resource(mappedName = "java:jboss/datasources/Green")
    public void setGreenDataSource(DataSource greenDataSource) {
        this.jdbcTemplate = new JdbcTemplate(greenDataSource);
    }
}

Dropping unknown default constraints in SQL server

For releases, we had to provide SQL scripts to install database changes. These had to be run outside of SQL Server Management Studio, on multiple environments. Occasionally, I had to drop unnamed constraints in a script. (Most developers wrote their install scripts with named constraints to avoid this difficulty.) Every time I encountered the problem, I googled and followed this excellent blog post by Rob Farley. It gives the SQL to query for the name of an unnamed constraint, given the table and column names, but stops short of the SQL to actually drop the constraint.

So here is my take:

declare @df_name varchar(max)
select @df_name = d.name from sys.tables t
    join sys.default_constraints d
        on d.parent_object_id = t.object_id
    join sys.columns c
        on c.object_id = t.object_id
        and c.column_id = d.parent_column_id
    where t.name = 'some_db_table'
    and c.name = 'some_column_in_table'

if @df_name is not null
begin
    declare @sql varchar(max) = 'alter table some_db_table drop constraint ' + @df_name
    exec (@sql)
end
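
If the constraint name might need quoting, wrapping it with quotename() makes the dynamic SQL a little safer (a small variation on the above):

declare @sql varchar(max) = 'alter table some_db_table drop constraint ' + quotename(@df_name)
exec (@sql)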

Java 8 Date-Time API and good old java.util.Date

Am I the only one who prefers Joda Time over the new Java 8 java.time package? I find the official Oracle documentation poor, and the API not as intuitive.

No matter which high level datetime library an application uses, be it java.util.Calendar, Joda Time or java.time, developers still often have to work with the old fashioned java.util.Date. This is because java.sql.Date is a subclass of java.util.Date, and therefore most, if not all, data access layer code expects or returns java.util.Date.

Converting a datetime such as 2016-11-21 09:00 to and from java.util.Date is very simple in Joda Time.

// from Joda to Date
DateTime dt = new DateTime();
Date jdkDate = dt.toDate();

// from Date to Joda
dt = new DateTime(jdkDate);

Java 8 java.time has two separate ways to represent time: human time vs machine time. Classes such as LocalDateTime and LocalDate represent human time; the Instant class represents machine time. Conversions between a datetime and java.util.Date must go via an Instant.

// from LocalDateTime to Date (note: 9 and 0, not 09 and 00; a leading
// zero makes an octal literal in Java, and 09 doesn't even compile)
LocalDateTime dt = LocalDateTime.of(2016, 11, 21, 9, 0);
Instant i = dt.atZone(ZoneOffset.UTC).toInstant();
Date d = Date.from(i);

// from Date to LocalDateTime
i = d.toInstant();
dt = LocalDateTime.ofInstant(i, ZoneOffset.UTC);
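
The same Instant bridge works for date-only values. For example:

// from Date to LocalDate, via the same Instant
LocalDate ld = d.toInstant().atZone(ZoneOffset.UTC).toLocalDate();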

You can also compare the documentation of the two libraries on interoperability with java.util.Date. The Joda Time one is much shorter and easier to read.

Mule flow variables to JSON payload for REST requests

I was working on a Mule flow that submits a static JSON request to a REST endpoint. (The variable part was the id in the URL.) My first attempt was to set the JSON request directly using <set-payload>.

<set-variable variableName="orderId" value="#[payload.id]" doc:name="set orderId"/>
<set-payload value="{'note' : 'Order auto-approved by X', 'sendEmail' : true}" doc:name="Set Payload"/>
<http:request config-ref="WS_CONFIG" path="/order/#[flowVars.orderId]/approve" method="POST" doc:name="REST approve request">
  <http:request-builder>
    <http:header headerName="Content-Type" value="application/json"/>
  </http:request-builder>
</http:request>

However, Mule refused to submit this request, complaining ‘Message payload is of type: String’. Most pages I found from googling suggested the DataWeave Transformer, which can transform data to and from a large range of formats, including flow variables into JSON. But the DataWeave Transformer is only available in the Enterprise Edition. After a frustrating hour of more googling and testing various transformers, I found another way to achieve this easily, using an expression transformer:

<set-variable variableName="orderId" value="#[payload.id]" doc:name="set orderId"/>
<expression-transformer expression="#[['note' : 'Order auto-approved by X', 'sendEmail' : true]]" doc:name="set payload"/>
<json:object-to-json-transformer doc:name="Object to JSON"/>
<http:request config-ref="WS_CONFIG" path="/order/#[flowVars.orderId]/approve" method="POST" doc:name="REST approve request">
  <http:request-builder>
    <http:header headerName="Content-Type" value="application/json"/>
  </http:request-builder>
</http:request>

The flow I worked on didn’t need the order id in the JSON request. But you can reference flow variables in the payload like this:

<set-variable variableName="orderId" value="#[payload.id]" doc:name="set orderId"/>
<expression-transformer expression="#[['note' : 'Order auto-approved by X', 'id':flowVars.orderId, 'sendEmail' : true]]" doc:name="set payload"/>

Application context XML configuration in a Spring Boot web service

A colleague told me recently that he didn’t use Spring for his latest REST project because he couldn’t get the beans defined in an XML configuration file loaded. He was familiar with Spring but had never bootstrapped a brand new project. I hadn’t realised this could be a problem, because I have used Spring MVC for a very long time. He was right: it is not obvious. For example, in the Spring Boot tutorial Building a RESTful Web Service, everything is @Autowired. In a real application, you might need to define some beans in an XML configuration file, such as the database connection information for the persistence layer.

Using the example from my previous post on Spring Boot, you can use the @ImportResource annotation to load XML configuration files.

@SpringBootApplication
@ComponentScan
@ImportResource("classpath:spring-config.xml")
public class Application extends SpringBootServletInitializer {
  public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
  }
}

Spring will auto-scan classes annotated with @Service, @Repository, @Controller and @Component. Because Spring AOP is proxy-based, your DAO classes should implement interfaces. For example:

public interface OrderDao {
  Order getOrder(int id) throws OrderNotFoundException;
}

@Repository
public class OrderDaoImpl implements OrderDao {
  private JdbcTemplate jdbcTemplate;

  @Autowired
  public void setMyDataSource(DataSource myDataSource) {
    this.jdbcTemplate = new JdbcTemplate(myDataSource);
  }
}

For some reason, Spring’s own JdbcDaoSupport class does not support autowiring. If you choose to extend JdbcDaoSupport, you will need to use XML configuration to set the datasource manually. I prefer to have JdbcTemplate as a member and @Autowired the setter instead.

The datasource is defined in the XML file spring-config.xml, located in src/main/resources in a maven project. (Please use a proper connection pool in a real application; I’m using BasicDataSource here for simplicity’s sake.)

<bean id="myDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
  <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
  <property name="url" value="${db.url}"/>
  <property name="username" value="${db.username}"/>
  <property name="password" value="${db.password}"/>
</bean>

The properties are defined in application.properties, also in src/main/resources.

db.url=jdbc:sqlserver://localhost:1433;DatabaseName=Orders
db.username=Luser
db.password=Pwd

Note: I’m using Spring Boot 1.3.5

Building a Spring Boot RESTful Web Service for Wildfly 8.2

The Spring Boot project promises an easy and fuss-free way to build and configure Spring applications. I finally got a chance to try it out today. I needed to build a simple RESTful web service to deploy on Wildfly 8.2.

I followed the tutorial at https://spring.io/guides/gs/rest-service/. The tutorial was written for an embedded web server. I needed to make a few tweaks to get my app running on Wildfly.

By adding

<packaging>war</packaging>

to pom.xml, I was able to generate a war file using mvn package. However, when I deployed the war file in Wildfly, I got the following exception:

 java.lang.RuntimeException: 
    java.lang.ClassCastException:
      org.apache.tomcat.websocket.server.WsServerContainer cannot be cast to
      io.undertow.websockets.jsr.ServerWebSocketContainer

Spring Boot packaged Tomcat jars into the war file, and they conflicted with Wildfly’s own Undertow web server. I then added the following exclusion to pom.xml:

<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-web</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-tomcat</artifactId>
    </exclusion>
  </exclusions>
</dependency>

After this change, the web app deployed and started, but could not receive any requests. All GET requests returned 403 Forbidden and POST requests returned 405 Method Not Allowed. There was nothing in the log files indicating what was wrong. After a bit of head banging I found out the problem was that Wildfly couldn’t forward requests to the web app! I needed to:

  1. include servlet-api jars, and
    <dependency>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
      <version>2.5</version>
    </dependency>
    
  2. annotate the main class with @ComponentScan and make it a subclass of SpringBootServletInitializer
    @SpringBootApplication
    @ComponentScan
    public class Application extends SpringBootServletInitializer {
    
        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }
    }
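
The setup above worked as-is, but if the container cannot find your configuration class, the documented hook is to override configure() in SpringBootServletInitializer:

@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
    return application.sources(Application.class);
}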
    

That’s it! I was impressed by how little configuration I needed to get my app up and running.

Note: I was using Spring Boot version 1.3.3

Stop accidental git commits of local dev changes to config files

During development, I often make changes to a few configuration files for local testing. Most of the time, I add each file individually to the staging area so these local config changes aren’t committed. Yesterday, I made a mistake and committed the local config. I wasn’t sure how it happened, but I must have accidentally clicked commit all tracked files. The test server was then built with my local config. Oops.

To stop this from happening again, I did some googling and found this handy git command:

git update-index --assume-unchanged <file>

This temporarily ignores changes in the specified file, all without touching .gitignore, which is a tracked file in the project.
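
To pick up changes to the file again later, the flag has a matching inverse:

git update-index --no-assume-unchanged <file>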