
Friday, April 17, 2015

ServiceMix - first prod integration deployed

I have been pushing to replace WebMethods within our organization for a few years now, and this is finally the first step - getting ServiceMix into production.  Unfortunately, this initial integration doesn't replace any existing WebMethods integration, but it did get ServiceMix into production.

Some minor details on the integration.

Tech used:
  • Linux
  • Java 8 
  • ServiceMix 5.4.x
  • Camel
    • SQL Component
  • PeopleSoft psjoa.jar and the associated jar of generated component interface definitions
The biggest challenge was getting the PeopleSoft pieces usable in a real OSGi context.  It isn't perfect, but it is functional.


Here is a simple diagram of the OSGi-related dependencies.
Since PeopleSoft integrations using psjoa.jar require the jar that exactly matches the version of PeopleTools in use, I used the PeopleTools version as the OSGi version of the exported packages.  I used the bnd Maven plugin to convert the jar into an OSGi bundle.  The only painful part is that I ran into a need to use DynamicImport-Package to pick up some internal references which were otherwise causing problems.  The interesting aspect is that what gets picked up appears to be things like JMS classes and similar items.  I am guessing that psjoa.jar does some things with Class.forName() for some optional functionality it provides, and that still causes OSGi issues for some unknown reason.
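
As a rough illustration, the wrapping boiled down to bundle instructions along these lines (a minimal sketch using the Felix maven-bundle-plugin with psjoa.jar as a dependency of the wrapper module; the package pattern and PeopleTools version shown here are illustrative, not the exact values from our build):

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Bundle-SymbolicName>psjoa</Bundle-SymbolicName>
      <!-- export the psjoa packages, versioned with the PeopleTools release in use (illustrative version) -->
      <Export-Package>psft.pt8.joa.*;version="8.53.0"</Export-Package>
      <!-- let reflective/optional references (JMS and friends) resolve at runtime -->
      <DynamicImport-Package>*</DynamicImport-Package>
    </instructions>
  </configuration>
</plugin>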
 
The PeopleSoft component interface definitions were a much larger headache.  The code is generated and you have no control over the code/package naming - all CI files are produced into the package PeopleSoft.Generated.CompIntfc.

The CI files are not necessarily tied to a particular PeopleTools version - a file can be used with later PeopleTools versions as long as nothing is structurally different between the CI definitions and the PeopleSoft server-side definitions you use.  The main issue here is that if you have multiple PeopleSoft ERP systems (i.e. HR, Financials, etc.) then the sets of CI definitions are different but the Java-side package must be the same.  This makes it harder to support multiple PeopleSoft systems (with different PeopleTools versions) concurrently in one ServiceMix instance.  I am not trying to run multiple integrations like this at the moment, but I will need to.  I think I worked around the problem, but it required the use of Require-Bundle - so Finance CI definitions go into a jar like finance-ci-<version>.jar and HR definitions go into a jar like hr-ci-<version>.jar.  They both contain the same Java packages, so in dependent integrations I import the package and I must also be sure to do a Require-Bundle on the bundle for the system I am interested in.  I think I have it working but need to do some further validation.  Without doing this, I would likely have to come up with some sort of version scheme which distinguishes the various ERP systems of interest - that seemed awkward as well, so for now I am doing it this way.
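
To make that concrete, the manifest of the HR-facing integration bundle ends up with headers roughly like these (hr-ci being the illustrative bundle symbolic name from above; the Finance-facing integration would require finance-ci instead):

Import-Package: PeopleSoft.Generated.CompIntfc
Require-Bundle: hr-ci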

Back to the overall integration.  Nothing fancy: some database triggers generate records into a staging table which acts as input to the ServiceMix integration.  ServiceMix (Camel SQL) has a route which periodically runs a SQL query against the table and pulls the resulting rows into the route.  The route takes the data and does some work against the PeopleSoft instance; if no error/exception results, the route marks the originating row as processed.  If something goes wrong, the route marks the failed row as failed.
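
For illustration only, the shape of such a route in Blueprint is roughly the following (table, column, status values, and bean names are placeholders rather than our actual ones, and the datasource bean is not shown):

<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">

  <!-- SQL component wired to the staging database (datasource bean defined elsewhere) -->
  <bean id="sql" class="org.apache.camel.component.sql.SqlComponent">
    <property name="dataSource" ref="stagingDataSource"/>
  </bean>

  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route id="staging-to-peoplesoft">
      <!-- poll the staging table for new rows; mark each row done or failed after processing -->
      <from uri="sql:SELECT * FROM STAGING_TBL WHERE STATUS = 'NEW'?onConsume=UPDATE STAGING_TBL SET STATUS = 'DONE' WHERE ID = :#ID&amp;onConsumeFailed=UPDATE STAGING_TBL SET STATUS = 'FAILED' WHERE ID = :#ID"/>
      <!-- bean that pushes the row into PeopleSoft via the component interface -->
      <to uri="bean:peopleSoftUpdater"/>
    </route>
  </camelContext>
</blueprint>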

One thing that isn't working for me right now: I intended to have the route delete the source rows on success, but that is not happening at the moment and I am not sure why.  I am wondering if it is a bug in Camel.  I did check the Camel unit tests and there is a test matching basically what I am doing, but the test is done via Java code versus using Blueprint - it seems like that shouldn't matter, but I have no better answer yet.  Or it could be differences in database/drivers (maybe far fetched - I am not doing anything wild here).  I did note in the documentation that there appear to be some limitations on the number of parameters in the queries specified for the SQL/DML.  I just don't have time to debug it for now.  I will probably write a quick batch job to run every so often to clean up the completed data and generate a report on failures, counts, timings, etc.
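
For reference, the behavior I was after is the same consumer with a delete in onConsume instead of the status update - roughly this endpoint (same placeholder names as the sketch above):

sql:SELECT * FROM STAGING_TBL WHERE STATUS = 'NEW'?onConsume=DELETE FROM STAGING_TBL WHERE ID = :#ID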

There are still a number of setup changes which will likely occur in the environment as we implement more complex integrations, but at least we have started the process - I am very thankful for that.

Thanks for checking this out; hope it was interesting and maybe even helpful in some way.

God bless!

Scott


Sunday, September 21, 2014

Jenkins and Groovy - ERP data quality and validation

Many organizations run an ERP, and those systems may manage large amounts of data.  I won't bother debating the tradeoffs of commercial ERP systems, but I want to mention some ways I am dealing with certain data issues.

Our ERP is one that has some pretty old roots (and they show their grey) - with the changes in technology, schema, functionality, etc. over the years, there are plenty of gaps in the data validation the system performs.  Each new vendor maintenance release is like rolling the dice as to which validations change (for better or worse).  On top of that, the vendor process for major upgrades doesn't really handle conversion and/or reporting of some data, and it is usually the tables which really need the extra work.  There are also business process and training issues within the distributed organization.  All of this results in many data messes.

Data issues on their own are bad enough, but the way the problems tend to manifest for us provides a very bad experience for our customers.  In our case, the vendor technology manages data in a way where some data which is not actively being used (i.e. not being touched by an end user) is still operated on and validated as part of some transaction.  The problem is that when the actively used/accessed data is saved, the system also performs validations on the non-active data.  If that non-active data is invalid, the system throws a tantrum which usually has no useful message for the end user, since the data at fault is not in any way related to what they did.  This is especially true when you have a custom application accessing the system - useful error responses from the source system are a rarity.

So what do you do?

I am slowly putting in place Jenkins jobs which perform checks and validations for some of the common problems.  I write the jobs to identify bad data, form a report of some sort, and email it to particular users who I hope are able to direct it to the appropriate person for correction/mitigation.

The solution I am documenting here uses Jenkins with Groovy, using JDBC to perform a check, generate a report, and email the results of any failures.

I have modified things somewhat to take out organization-identifying information.

import groovy.text.SimpleTemplateEngine
import javax.naming.*
import javax.sql.DataSource
import javax.mail.internet.InternetAddress
import javax.mail.internet.MimeMessage
import javax.mail.*
import javax.activation.*
import groovy.sql.Sql
import static java.util.Calendar.*
 
class PlanValidations
{

    // Produce a connection to the database
    def connect()
    {
          Context initCtx = new InitialContext();
          DataSource sqlDS = initCtx.lookup("jdbc/connid");
          def sql = new groovy.sql.Sql(sqlDS);
          return sql;
    }

        // An array to contain the column names returned by the query
    def colnames = []
 
    def checkPlanInactivation(sql)
    {
            // The query is looking for a misconfiguration which is typically a failure to follow the proper business process.
            // The correct process for disabling the data item is to add a new row with a current effective date which states
            //  the period the data should be disabled starting at.  The inner query selects each row plus several "lagging" fields,
            //  which the outer query compares against to determine if the proper setup was used.

      def badInactivePlanSQL = """
SELECT *
FROM
  (SELECT INSTITUTION,
    FIRST_PERIOD_VALID,
    PROG,
    PLAN,
    LAST_PERIOD,
    LAG(LAST_PERIOD, 1, ' ')
      OVER (PARTITION BY INSTITUTION, PROG, PLAN ORDER BY EFFDT ASC)
        PREV_LAST_PERIOD,
    EFF_STATUS,
    LAG(eff_status, 1, ' ')
      OVER (PARTITION BY INSTITUTION, PROG, PLAN ORDER BY EFFDT ASC)
        PREV_eff_status,
    effdt,
    LAG(effdt, 1, NULL)
      OVER (PARTITION BY INSTITUTION, PROG, PLAN ORDER BY EFFDT ASC)
        PREV_eff_dt
  FROM PLAN_TBL
  ORDER BY INSTITUTION,
    PROG,
    plan,
    effdt
  )
WHERE EFF_STATUS    = 'I'
AND prev_eff_status = 'A'
AND prev_LAST_PERIOD = ' '
AND EFFDT           > SYSDATE 
           ";

   
         // This will be a map of lists  {inst -> [rows]}
      def institutionBadSetup = [:].withDefault {[]} 
            
      sql.eachRow(badInactivePlanSQL,
                  // keep track of the column names returned by the query
         { meta ->
           (1..meta.columnCount).each { colnames << meta.getColumnName( it ).toLowerCase() }
         },
                 // save off the bad data by institution
         {
             institutionBadSetup[it.INSTITUTION] << it.toRowResult()
         }
      )
      return institutionBadSetup
    }

    def doChkInactivations()
    {
         def conn = connect();        
         def invalidPlanSetups = checkPlanInactivation(conn)
                   // take all the results and send off an email per institution
         invalidPlanSetups.each() {
           institution, lst -> sendEmail(getEmailTo(institution, conn), getSubject(institution), getMsg(institution, lst))
         }     
        
         conn.close();        
    }

        //  Produce an html message with description of problem, related data and hints at fixing.
    def getMsg(institutionId, plans)
      {
       def html = """\
         <html>
         <head></head>
         <body>
         There is a problem with the setup of the following plans for $institutionId.<br/><br/>
       
          *** Provide a message here which describes the problem and provides help in correcting. ***
        <br/><br/>
         """
     
            // use the saved off column names and match up with the field data while constructing the simple report
       plans.each() { plan ->
          (0..colnames.size()-1).each {
            println plan.PLAN + " " + it
            html += "${colnames[it]}[${plan[it]}] "
          }
         html +="<br/><br/>"
       }
     
       html += """\
        </body>
        </html>""";
     
        return html
      }
 
    // Produce a subject for the email
    def getSubject(institutionId)
      {
      def env = System.getenv()
      String environ = env['ENV_TYPE']
      String hostname = env['HOSTNAME']
     
      return "invalid INACTIVE plan setup - institution(${institutionId}) <system name>(${environ}) server(${hostname})"
      }
 
      // Use an existing table to look up the email recipient(s) for an institution
      def getEmailTo(institutionId, sql)
      {
      // hard coded a user [me] as email "to" during R&D
      def emailSql = "select 'me@someplace.org' as cntct_eml from inst_misc where institution = '${institutionId}'"
      def emailAdr = sql.firstRow(emailSql)
      return emailAdr.cntct_eml
      }
 
    def sendEmail(receiver_email, email_subject, msgtext)
    {       
        def reportdt = String.format('%tY/%<tm/%<td  %<tH:%<tM:%<tS', Calendar.instance);       
 
        println "$msgtext";

        Properties mprops = new Properties();
        mprops.setProperty("mail.transport.protocol", "smtp");
        mprops.setProperty("mail.host", "smtp.somewhere.org");
        mprops.setProperty("mail.smtp.port", "25");
        Session lSession = Session.getDefaultInstance(mprops, null);
        MimeMessage msg = new MimeMessage(lSession);
       
        msg.setText(msgtext, "utf-8", "html");
       
        def me = "sender@somewhere.org";
        StringTokenizer tok = new StringTokenizer(receiver_email, ";");
        
        ArrayList emailTos = new ArrayList();
        while(tok.hasMoreElements())
        {
           emailTos.add(new InternetAddress(tok.nextElement().toString()));
        }
        
        InternetAddress[] to = new InternetAddress[emailTos.size()];
        to = (InternetAddress[]) emailTos.toArray(to);
        msg.setRecipients(MimeMessage.RecipientType.TO,to);
        
        msg.setSubject(email_subject);
        msg.setFrom(new InternetAddress(me));
       
        //Send the message via our own SMTP server.
        Transport transporter = lSession.getTransport("smtp");
        transporter.connect();
        msg.saveChanges();
        transporter.sendMessage(msg, msg.getAllRecipients());
        transporter.close();
     }
}

validationChk = new PlanValidations();
validationChk.doChkInactivations();