Friday, April 17, 2015

ServiceMix - first prod integration deployed

I have been pushing to replace WebMethods within our organization for a few years now, and this is finally the first step: getting ServiceMix into production. Unfortunately, this initial integration doesn't replace any WebMethods integration, but it did get ServiceMix into production.

Here are some details on the integration.

Tech used:
  • Linux
  • Java 8 
  • ServiceMix 5.4.x
  • Camel
    • SQL Component
  • PeopleSoft psjoa.jar and the associated jar of generated component interface definitions
The biggest challenge was getting the PeopleSoft pieces usable in a real OSGi context. It isn't perfect, but it is functional.


Here is a simple diagram of the OSGi-related dependencies.
Since PeopleSoft integrations using psjoa.jar require the jar that exactly matches the version of PeopleTools in use, I used the PeopleTools version as the OSGi version of the exported packages. I used the Maven bnd plugin to convert the jar into an OSGi bundle. The only painful part was that I had to use DynamicImport-Package to pick up some internal references that were otherwise causing problems. The interesting aspect is that what gets picked up appears to be things like JMS classes and similar. I am guessing that psjoa.jar does some Class.forName() work for optional functionality it provides, and that still causes OSGi issues for reasons I haven't pinned down.
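A minimal sketch of what that wrapping might look like, assuming the Apache Felix maven-bundle-plugin (which drives bnd); the plugin choice, package pattern, and the 8.53.0 version shown are illustrative, not the exact values I used:

```xml
<!-- Hypothetical pom.xml fragment: wrap psjoa.jar as an OSGi bundle.
     8.53.0 stands in for the actual PeopleTools version in use. -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Bundle-SymbolicName>psjoa</Bundle-SymbolicName>
      <Bundle-Version>8.53.0</Bundle-Version>
      <!-- Export the psjoa API packages at the PeopleTools version -->
      <Export-Package>psft.pt8.joa.*;version=8.53.0</Export-Package>
      <!-- Let reflective (Class.forName) references, e.g. JMS classes,
           resolve at runtime instead of failing bundle resolution -->
      <DynamicImport-Package>*</DynamicImport-Package>
    </instructions>
  </configuration>
</plugin>
```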
 
The PeopleSoft component interface definitions were a much larger headache. The code is generated and you have no control over the code/package naming - all CI files are generated into the package PeopleSoft.Generated.CompIntfc.

The CI files are not necessarily tied to a particular PeopleTools version - a file can be used with later PeopleTools versions as long as nothing is structurally different between the CI definition and the PeopleSoft server-side definition you use. The main issue is that if you have multiple PeopleSoft ERP systems (e.g. HR, Financials), the set of CI definitions differs, but the Java-side package must be the same. That makes it harder to support multiple PeopleSoft systems (with different PeopleTools versions) concurrently in one ServiceMix instance. I am not running multiple integrations like this at the moment, but I will need to. I think I worked around the problem, but it required the use of Require-Bundle: Finance CI definitions go into a jar like finance-ci-<version>.jar and HR definitions go into a jar like hr-ci-<version>.jar. Both contain the same Java packages, so in a dependent integration I import the package and must also be sure to Require-Bundle the jar for the system I am interested in. I think I have it working, but I need to do further validation. Without this, I would likely need some version scheme that distinguishes the various ERP systems of interest - that seemed awkward as well, so for now I am doing it this way.
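A sketch of what the dependent bundle's manifest might look like for the HR case; the bundle symbolic name hr-ci is illustrative:

```
# Hypothetical MANIFEST.MF fragment for an integration bundle that
# targets the HR system. Import-Package alone is ambiguous because
# both CI bundles export PeopleSoft.Generated.CompIntfc, so
# Require-Bundle pins resolution to the HR jar.
Import-Package: PeopleSoft.Generated.CompIntfc
Require-Bundle: hr-ci
```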

Back to the overall integration. Nothing fancy: database triggers generate records into a table which acts as input to the ServiceMix integration. A ServiceMix (Camel SQL) route periodically runs a query against the table and pulls the data into the route. The route takes the data and does some work against the PeopleSoft instance; if no error/exception results, the route marks the originating row as processed. If something goes wrong, the route marks the failed row as failed.
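A minimal sketch of that kind of route in Blueprint, assuming the camel-sql component's consumer.onConsume option; the table name, status values, and processor bean are illustrative, not my actual schema:

```xml
<!-- Hypothetical Blueprint fragment: poll a staging table, hand each
     row to a bean that calls PeopleSoft via psjoa, and mark the row
     as done once it is consumed successfully. -->
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route id="staging-to-peoplesoft">
      <from uri="sql:select * from staging_rows where status = 'NEW'?consumer.onConsume=update staging_rows set status = 'DONE' where id = :#id"/>
      <!-- Bean that drives the component interface calls -->
      <to uri="bean:peopleSoftProcessor"/>
    </route>
  </camelContext>
</blueprint>
```

An onException block (or a consumer.onConsumeFailed-style update) would be the place to mark rows as failed instead of done.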

One thing that isn't working for me right now: I intended to have the route delete the source rows on success, but that is not working at the moment and I am not sure why. I wonder if it is a bug in Camel. I did check the Camel unit tests and there is a test matching basically what I am doing, but the test is done via Java code rather than Blueprint - it seems like that shouldn't matter, but I don't have a better answer. Or it could be differences in database/drivers (maybe far-fetched - I'm not doing anything wild here). I did note in the documentation that there appear to be some limitations on the number of parameters in the queries specified for the SQL/DML. I just don't have time to debug it for now. I will probably write a quick batch job to run every so often to clean up the completed data and generate a report on failures, counts, timings, etc.
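The interim batch job could be as simple as a couple of SQL statements run on a schedule; the staging_rows table and status values are illustrative:

```sql
-- Hypothetical cleanup/reporting for the staging table.
-- Report counts by outcome before purging anything.
select status, count(*) as row_count
from staging_rows
group by status;

-- Purge rows the route already processed successfully.
delete from staging_rows
where status = 'DONE';
```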

There are still a number of environment setup changes that will likely occur as we implement more complex integrations, but at least we have started the process - I am very thankful for that.

Thanks for checking this out; hope it was interesting and maybe even helpful in some way.

God bless!

Scott

