I am trying to define a canonical data model for a distributed platform I'm working on, which is based on the following architecture:
- A RESTful API façade, exposing functionalities to a client
- An underlying middleware, based on Apache Camel and used to route and transform the client requests
- RESTful business services invoked by the middleware
The idea is to let the façade layer translate the incoming requests into a common information model, which has the form of a custom message with header and payload:
- The header should contain the information needed by the middleware (Apache Camel) to route the message to the requested workflow (so basically the façade knows which business process should be invoked to handle each incoming request from the client). It can be modelled as an enum map or as a set of attributes on the class representing the custom message.
- The payload should contain a proper Java bean representing the "business model" for the incoming request (e.g. a TicketOrder or Customer object). It can be modelled as an Object attribute in the class representing the custom message, and all the Camel processors/converters involved in the workflow handling that message should expect that payload type (the selected Java bean).
In short, I am trying to define a business data model for the middleware which contains only the relevant information Camel needs to process the incoming requests and route them to the business services. This data is modelled as a Java bean and attached as the payload of the message, whose header contains the routing details that are meaningful to Camel.
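To make the question concrete, here is a minimal sketch of what I have in mind for the custom message (the class name, the `RoutingKey` enum, and its entries are just illustrative, not from any framework):

```java
import java.util.EnumMap;
import java.util.Map;

// Hypothetical routing keys the façade sets for the middleware.
enum RoutingKey {
    WORKFLOW,   // which business process should handle the request
    OPERATION,  // e.g. CREATE, UPDATE, CANCEL
    REPLY_TO    // where the result should be delivered
}

// Canonical message: a routing header for Camel plus an opaque business payload.
public class CanonicalMessage {

    private final Map<RoutingKey, String> header = new EnumMap<>(RoutingKey.class);
    private final Object payload; // e.g. a TicketOrder or Customer bean

    public CanonicalMessage(Object payload) {
        this.payload = payload;
    }

    public void setHeader(RoutingKey key, String value) {
        header.put(key, value);
    }

    public String getHeader(RoutingKey key) {
        return header.get(key);
    }

    public Object getPayload() {
        return payload;
    }
}
```

The downstream processors would cast `getPayload()` to the bean type their workflow expects, so the `WORKFLOW` header entry and the actual payload type must be kept in sync by the façade.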
How would you improve the above solution? Would you say it is a good approach and flexible enough? Thank you very much.
If you are using Camel as a (lightweight) ESB, I think there isn't too much to worry about. Inside the Camel routes you can just keep the format that best suits each case; you can even have more than one format running in parallel in some solutions (e.g. you could use the Java bean to do some business logic, but you could also use XSLT to transform the message into XML and XPath to achieve better decoupling and do some content-based routing, CBR).
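To illustrate the CBR idea without tying it to a particular Camel DSL version, here is a plain-Java sketch of header-driven dispatch (the `WORKFLOW` header value and workflow names are made up); in Camel itself you would express the same thing with `choice()`/`when()` predicates on a message header:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Plain-Java analogue of a content-based router: the header value chosen
// by the façade selects which workflow handles the payload.
public class WorkflowRouter {

    private final Map<String, Function<Object, String>> workflows = new HashMap<>();

    public void register(String workflowName, Function<Object, String> handler) {
        workflows.put(workflowName, handler);
    }

    // Dispatch on the routing header, like a Camel choice()/when() chain.
    public String route(String workflowHeader, Object payload) {
        Function<Object, String> handler = workflows.get(workflowHeader);
        if (handler == null) {
            return "dead-letter"; // analogous to otherwise()/a dead letter channel
        }
        return handler.apply(payload);
    }
}
```

The point is that the router only looks at the header; the payload stays opaque until the selected workflow receives it, which is exactly the decoupling you are after.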
If you need to work a lot with the Java bean model, I would recommend creating a separate project that generates the model from the JSON, so you can control the version of this artifact (with Maven, for example) and it can serve as the canonical model for the Java applications (you would need to update this project if the JSON requests change a lot). Creating this Java data model project and working with an OSGi container would let you decouple even further, since OSGi can load different versions of the model for different applications.
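With the model extracted into its own artifact, each consuming application just declares a dependency on it; a hypothetical Maven coordinate (the group, artifact, and version are purely illustrative) might look like:

```xml
<!-- Hypothetical coordinates for the canonical data model artifact -->
<dependency>
    <groupId>com.example.platform</groupId>
    <artifactId>canonical-model</artifactId>
    <version>1.2.0</version>
</dependency>
```

Each consumer pins the model version it was built against, and in an OSGi container different bundles can even resolve different versions of this artifact at runtime.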
Regards, Luan