Friday, 19 June 2015

IBM Integration Bus - Developing my interest ( and an SOA / ESB solution )

I have blogged about IBM Integration Bus (IIB) a fair bit recently, as I develop my expertise.

Most recently, I wrote about this specifically because I'm busy developing an integration demonstration for my client, comprising IBM Business Process Manager, IBM Integration Bus, IBM WebSphere MQ and IBM DB2.

Today's post describes my experience creating a really really simple flow in IIB.

In the first instance, this flow represents a service that exposes a function from my mythical System of Record, actually a DB2 database table called EMPLOYEE.

My objective is to have this service running inside my Enterprise Service Bus (ESB), which is being delivered by IIB, and be callable in an asynchronous AND synchronous manner.

Initially, my service ( flow ) is only exposed via WebSphere MQ.

In outline, I have an Integration Node, called IB9NODE, which hosts an Integration Server, called IIB9, and a Queue Manager, called IB9QMGR.

The Queue Manager hosts a pair of Queues, CUSTOMER.INPUT and CUSTOMER.OUTPUT.

I will script the definition of the IIB components and the Queue Manager at a later date; to start with, I created them using the IIB Toolkit ( I'm using IIB 9, which has an automatic dependency on an underlying Queue Manager ).

Having set up IIB and the Queue Manager, I created my Queues: -

defineQueues.msc 

DEFINE QLOCAL(CUSTOMER.INPUT)
DEFINE QLOCAL(CUSTOMER.OUTPUT)


runmqsc IB9QMGR < defineQueues.msc 

5724-H72 (C) Copyright IBM Corp. 1994, 2014.
Starting MQSC for queue manager IB9QMGR.


     1 : DEFINE QLOCAL(CUSTOMER.INPUT)
AMQ8006: WebSphere MQ queue created.
     2 : DEFINE QLOCAL(CUSTOMER.OUTPUT)
AMQ8006: WebSphere MQ queue created.
       : 
2 MQSC commands read.
No commands have a syntax error.
All valid MQSC commands were processed.

My flow effectively joins the two Queues together, by way of a Compute Node. This Compute Node uses a Database Service, which does the clever stuff, i.e. connecting to DB2 via ODBC and exposing one or more SQL operations - I'm "merely" doing a SELECT.
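At its heart, the Database Service wraps a parameterised SELECT against the EMPLOYEE table. As a rough sketch of that lookup - using Python's sqlite3 as an in-memory stand-in for DB2/ODBC, with the column names and sample data taken from the flow's output later in this post:

```python
import sqlite3

# In-memory stand-in for the DB2 SAMPLE database's EMPLOYEE table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EMPLOYEE (EMPNO TEXT, FIRSTNME TEXT, LASTNAME TEXT)")
conn.execute("INSERT INTO EMPLOYEE VALUES ('000100', 'THEODORE', 'SPENSER')")

def retrieve_employee(empno):
    # Mirrors the SELECT that the Database Service exposes
    cur = conn.execute(
        "SELECT EMPNO, FIRSTNME, LASTNAME FROM EMPLOYEE WHERE EMPNO = ?",
        (empno,),
    )
    return cur.fetchone()

print(retrieve_employee("000100"))  # ('000100', 'THEODORE', 'SPENSER')
```

This is a simulation only - the real flow resolves the datasource via ODBC, as described below.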

So, logically the client, be that MQ, the IIB Toolkit or, in the future, a BPEL flow hosted by IBM BPM, puts a message onto the CUSTOMER.INPUT Queue, the flow does the heavy lifting to retrieve the selected row from DB2, and places the output onto the CUSTOMER.OUTPUT Queue.
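That asynchronous put/transform/get pattern can be sketched with two in-memory queues standing in for CUSTOMER.INPUT and CUSTOMER.OUTPUT ( again, a simulation only - a real client would put and get via MQ, and the lookup here is a hard-coded dictionary rather than DB2 ):

```python
import json
import queue

# In-memory stand-ins for the two MQ queues
customer_input = queue.Queue()
customer_output = queue.Queue()

# Stand-in for the DB2 lookup performed by the Database Service
EMPLOYEES = {
    "000100": {"EMPNO": "000100", "FIRSTNME": "THEODORE", "LASTNAME": "SPENSER"},
}

def flow():
    # The "Compute Node": read the request, do the lookup, write the reply
    request = json.loads(customer_input.get())
    row = EMPLOYEES[request["EmployeeID"]]
    customer_output.put(row)

# The "client" puts a JSON request, the flow replies on the output queue
customer_input.put('{"EmployeeID":"000100"}')
flow()
print(customer_output.get())
```

The key point is the decoupling: the client only ever touches the two queues, never the database.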

This is what my flow looks like: -

My Input Node is configured to use the CUSTOMER.INPUT Queue, and to parse incoming messages in the JSON format.


Here's an example message in JSON format: -

{"EmployeeID":"000100"}

My Compute Node is configured to bind to the SAMPLE ODBC datasource ( see my previous post re ODBC fun and games ): -


The Compute Node contains some basic generated ESQL, which I've subtly modified: -

PATH DatabaseService.EMPLOYEE_OPS_GROUP, DatabaseService1.EMPLOYEE_OPS_GROUP;

CREATE COMPUTE MODULE customerService_Compute
	CREATE FUNCTION Main() RETURNS BOOLEAN
	BEGIN
		DECLARE dbResultSet ROW;
		DECLARE dbResultSetRef REFERENCE TO dbResultSet;

		DECLARE rowRef REFERENCE TO dbResultSetRef.row;

		DECLARE empno CHARACTER;
		SET empno = InputRoot.JSON.Data.EmployeeID;
		CALL retrieveEmployee(empno, dbResultSetRef);

		SET OutputRoot.XMLNSC.EMPLOYEE = rowRef;

		RETURN TRUE;
	END;

	CREATE PROCEDURE CopyMessageHeaders() BEGIN
		DECLARE I INTEGER 1;
		DECLARE J INTEGER;
		SET J = CARDINALITY(InputRoot.*[]);
		WHILE I < J DO
			SET OutputRoot.*[I] = InputRoot.*[I];
			SET I = I + 1;
		END WHILE;
	END;

	CREATE PROCEDURE CopyEntireMessage() BEGIN
		SET OutputRoot = InputRoot;
	END;
END MODULE;


The ESQL above mixes generated code with my own additions; here's the breakdown.

Of that, this is what the IIB Toolkit gave me when I created the Database Service against the SAMPLE database: -

DECLARE dbResultSet ROW;
DECLARE dbResultSetRef REFERENCE TO dbResultSet;

DECLARE rowRef REFERENCE TO dbResultSetRef.row;

CALL retrieveEmployee(empno, dbResultSetRef);


and this is what I added: -

DECLARE empno CHARACTER;
SET empno = InputRoot.JSON.Data.EmployeeID;
SET OutputRoot.XMLNSC.EMPLOYEE = rowRef;


The first line: -

DECLARE empno CHARACTER;

sets up a variable called, imaginatively, empno as type CHARACTER.

The second line: -

SET empno = InputRoot.JSON.Data.EmployeeID;

assigns the value of the JSON object EmployeeID to the empno variable - the InputRoot "variable" relates to the entire incoming MQ message, and uses the JSON parser to retrieve the EmployeeID object.
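In Python terms, this extraction is simply a JSON parse followed by a key lookup:

```python
import json

# The incoming MQ message body, as shown earlier
message = '{"EmployeeID":"000100"}'

# Equivalent of InputRoot.JSON.Data.EmployeeID
empno = json.loads(message)["EmployeeID"]
print(empno)  # 000100
```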

The third line: -

SET OutputRoot.XMLNSC.EMPLOYEE = rowRef;

does almost the reverse - it assigns the value of rowRef ( as retrieved by the Database Service and stored as a row in a Result Set ) to the OutputRoot "variable", using the XMLNSC parser to create the EMPLOYEE XML message.

It's this EMPLOYEE message that is put onto the outgoing CUSTOMER.OUTPUT Queue.
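The shape of that EMPLOYEE message can be reproduced as a sketch with Python's xml.etree, treating the result-set row as a dictionary ( the column names come from the flow's output, shown below ):

```python
import xml.etree.ElementTree as ET

# A result-set row, as the Database Service might return it
row = {"EMPNO": "000100", "FIRSTNME": "THEODORE", "LASTNAME": "SPENSER"}

# Equivalent of SET OutputRoot.XMLNSC.EMPLOYEE = rowRef;
employee = ET.Element("EMPLOYEE")
row_elem = ET.SubElement(employee, "row")
for column, value in row.items():
    ET.SubElement(row_elem, column).text = value

print(ET.tostring(employee, encoding="unicode"))
# <EMPLOYEE><row><EMPNO>000100</EMPNO><FIRSTNME>THEODORE</FIRSTNME><LASTNAME>SPENSER</LASTNAME></row></EMPLOYEE>
```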

Finally, here's me testing the flow, using the MQ samples amqsput and amqsget: -

Input

/opt/mqm/samp/bin/amqsput CUSTOMER.INPUT IB9QMGR

Sample AMQSPUT0 start
target queue is CUSTOMER.INPUT
{"EmployeeID":"000100"}
{"EmployeeID":"000200"}


Output

/opt/mqm/samp/bin/amqsget CUSTOMER.OUTPUT IB9QMGR

Sample AMQSGET0 start
message <<EMPLOYEE><row><EMPNO>000100</EMPNO><FIRSTNME>THEODORE</FIRSTNME><LASTNAME>SPENSER</LASTNAME></row></EMPLOYEE>>
message <<EMPLOYEE><row><EMPNO>000200</EMPNO><FIRSTNME>DAVID</FIRSTNME><LASTNAME>BROWN</LASTNAME></row></EMPLOYEE>>


I then went a little further, and changed my Compute Node to return a JSON object: -

SET OutputRoot.JSON.Data.Employee = rowRef;

Having redeployed the flow to the Integration Server ( on the Integration Node ) from the Toolkit, I re-tested it using the MQ samples: -

/opt/mqm/samp/bin/amqsput CUSTOMER.INPUT IB9QMGR

Sample AMQSPUT0 start
target queue is CUSTOMER.INPUT
{"EmployeeID":"000100"}
{"EmployeeID":"000200"}

/opt/mqm/samp/bin/amqsget CUSTOMER.OUTPUT IB9QMGR

Sample AMQSGET0 start
message <{"Employee":{"row":{"EMPNO":"000100","FIRSTNME":"THEODORE","LASTNAME":"SPENSER"}}}>
message <{"Employee":{"row":{"EMPNO":"000200","FIRSTNME":"DAVID","LASTNAME":"BROWN"}}}>
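The JSON shape produced by the modified Compute Node can be sketched in the same way - the row dictionary is simply nested under Employee and row:

```python
import json

# A result-set row, as before
row = {"EMPNO": "000100", "FIRSTNME": "THEODORE", "LASTNAME": "SPENSER"}

# Equivalent of SET OutputRoot.JSON.Data.Employee = rowRef;
output = {"Employee": {"row": row}}

print(json.dumps(output, separators=(",", ":")))
# {"Employee":{"row":{"EMPNO":"000100","FIRSTNME":"THEODORE","LASTNAME":"SPENSER"}}}
```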

So, that's it for now ....

Next I need to create my "client" which will be an SCA module hosted on IBM BPM Advanced ( aka Process Server ) using a BPEL flow, which will use JMS to post the input message onto the CUSTOMER.INPUT Queue, and monitor the CUSTOMER.OUTPUT Queue for the resulting message.

I'll create the SCA module using IBM Integration Designer, and leverage the built-in Process Server integrated test environment.

When time allows, I'll then create a BPMN Process Application ( using Process Designer ) which will again be hosted on a BPM Advanced Process Server. This BPMN application will provide the user interface, where the end-user ( perhaps a service agent in a call centre or perhaps an employee via a self-service mobile application ) would enter the required employee ID and get back the resulting record.

From little acorns do large oak trees grow ......


