
Saturday, September 12, 2015

Active Directory / Oracle - time stamp handling dilemma

I recently received another last-minute development request.  I'm intentionally going to be a little vague on some details - some things can't be shared.

The general problem to solve is:  On a particular administrative action, a user must complete a specific activity within a particular time frame.  If that activity isn't completed, some data is manipulated to force the user to complete the activity in a timely fashion.  If the activity was completed, the flags tracking the condition are cleared.

On a first pass through prototyping a possible solution, I recognized it isn't quite as straightforward as hoped.

In this case, we handle several pieces of information from different sources (Oracle and Active Directory). One item is an Oracle date/time (sysdate) generated by the administrative transaction.  We save that "transaction timestamp" in Oracle along with a "to be done by" date. At the same time, an Active Directory field is indirectly updated to the equivalent of "now". This part of the process works OK and there are no real alternatives available at this time.

At this point in the process, an integration runs which looks for administrative transactions that have reached or passed the "to be done by" date and therefore need to be checked to verify the user completed their activity.  This involves comparing an Oracle date/time stored in Oracle against Oracle sysdate - which works well enough and causes no issues.
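As a rough illustration, the check amounts to a simple query against the stored dates.  Something like the sketch below, where the table and column names are hypothetical stand-ins for the real schema:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;

public class DueTransactionFinder {

    // Hypothetical table/column names, for illustration only.
    private static final String FIND_DUE_SQL =
            "SELECT txn_id, txn_timestamp "
          + "FROM admin_transactions "
          + "WHERE to_be_done_by <= SYSDATE AND completed_flag = 'N'";

    /** Returns the timestamps of transactions whose "to be done by" date has arrived or passed. */
    public static List<Timestamp> findDueTransactionTimes(Connection conn) throws SQLException {
        List<Timestamp> due = new ArrayList<>();
        try (PreparedStatement ps = conn.prepareStatement(FIND_DUE_SQL);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                due.add(rs.getTimestamp("txn_timestamp"));
            }
        }
        return due;
    }
}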

Next we get the user-specific last transaction timestamp from Active Directory, which Active Directory represents as a count of 100-nanosecond intervals since midnight, Jan. 1, 1601.  We then normalize the Oracle transaction timestamp to the same representation as the Active Directory timestamp.  Now in a perfect world, if the Active Directory timestamp is newer than the Oracle transaction timestamp, then the user completed what was required and we clear the flags and complete the transaction. Otherwise, we flag the user to complete their task.
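For reference, that normalization boils down to shifting the Unix epoch back to 1601 and scaling to 100-nanosecond units.  A minimal sketch, assuming the Oracle transaction time has already been read into epoch milliseconds:

public final class AdTimestampConverter {

    // Milliseconds between the AD/FILETIME epoch (1601-01-01T00:00:00Z)
    // and the Unix epoch (1970-01-01T00:00:00Z).
    private static final long EPOCH_DIFF_MILLIS = 11_644_473_600_000L;

    // Number of 100-nanosecond intervals in one millisecond.
    private static final long INTERVALS_PER_MILLI = 10_000L;

    /** Converts epoch milliseconds to AD's 100-ns intervals since 1601-01-01. */
    public static long toAdTimestamp(long epochMillis) {
        return (epochMillis + EPOCH_DIFF_MILLIS) * INTERVALS_PER_MILLI;
    }

    /** Converts an AD timestamp back to epoch milliseconds (sub-millisecond precision is dropped). */
    public static long toEpochMillis(long adTimestamp) {
        return adTimestamp / INTERVALS_PER_MILLI - EPOCH_DIFF_MILLIS;
    }
}

The precision mismatch mentioned below is already visible here: sysdate carries second precision, while the AD value is expressed in 100-nanosecond units.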

There are two basic problems though: (1) there is an inherent difference between the Oracle date/time and the Active Directory timestamp - possibly due to one or more causes [activities are only semi-coordinated across servers, and there are differences in available precision between the Oracle date/time and the AD timestamp]; (2) there can be minor differences between system times even when using NTP.

The result of these issues is that in some fairly common circumstances, a user is determined to have completed the transaction simply because of the time differences that occur when activities are serialized across different systems, each using its own clock.  I can easily see this in the test data I generated.  I have not knowingly run into an issue with differences in the actual clocks in this current situation, but we have had previous problems with clocks being out of sync.

The "cost" in this situation is significantly higher than desirable if users are determined to be "incomplete" when they actually are "complete".  On that same note, for other reasons it is in the organizations best interest to be as accurate as reasonable. 

[edit 2015/10/03]
Sourcing the time stamps only from AD isn't possible - initially I thought maybe it could be. The final solution isn't too hard to implement.  First I had to determine how close my times had to be to meet business needs.  In this case, I determined that the one affected use case would be fine with 10 seconds of accuracy.  The way I implemented that was to subtract the transaction time from the activity time; taking the absolute value of that difference and comparing it against 10 seconds tells me whether the user met the timing requirement.  Problem solved.  If the 10-second value is externally configurable, I can easily update the behavior as business requirements change.
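In code form, the heart of it is just an absolute difference compared against a configurable tolerance.  A sketch with illustrative names:

public final class ActivityTimingCheck {

    /**
     * Returns true if the two times are within the configured tolerance of each
     * other. Both values are epoch milliseconds; toleranceMillis comes from
     * external configuration (10,000 ms in the current use case), and the
     * business rule interprets the result.
     */
    public static boolean isWithinTolerance(long transactionMillis,
                                            long activityMillis,
                                            long toleranceMillis) {
        return Math.abs(activityMillis - transactionMillis) <= toleranceMillis;
    }
}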





Wednesday, February 27, 2013

Java / Struts2 I18N and DB backed ResourceBundle

I am a strong believer in deployment processes which don't involve making a patchwork of changes to a previously deployed WAR file in a production environment.  The downside is that things like minor changes to properties files require a full deployment (across a number of servers in this case).  The time consumed by this, combined with the fact that the normal maintenance window is at a time I would rather spend with the family, drove me to look for another solution.  After some searching, it seemed that the most reasonable solution fitting our needs involved moving the data to the database and writing some code to leverage the ResourceBundle framework.  I had hoped Struts would work seamlessly with this; that didn't turn out to be quite true.

For the moment, I ended up having to force a pre-load of the resource bundles (English and Spanish) before they would be found reliably. There is some odd behavior when working with our custom ResourceBundle.Control which could be a root cause.  There are still some problems where some application areas don't seem to pick up the Spanish data, and this may be an issue with Struts.  Further debugging is required - hopefully this can be resolved fully even if no optimal solution is found.

I truly wish that DB-backed ResourceBundles were supported directly by Struts 2.  I believe some other newer frameworks support this - maybe it is time to revisit framework decisions.

[2015/11/08] Notes added below.

My current use of this works by pre-loading the bundles from the DB before the first real need. After that point, the cached bundle is returned on requests from Struts, etc.  I am using the Spring framework to initialize/load the ResourceBundles early in the web application startup, thereby getting the bundles cached.
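The general shape of it, heavily simplified (the real class names, query, and Spring wiring differ), looks roughly like this:

import java.util.Collections;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.ResourceBundle;
import javax.sql.DataSource;

/** Simplified sketch of a DB-backed ResourceBundle.Control - illustrative only. */
public class DatabaseBundleControl extends ResourceBundle.Control {

    private final DataSource dataSource;

    public DatabaseBundleControl(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public List<String> getFormats(String baseName) {
        // A custom format name; only newBundle() below knows how to handle it.
        return Collections.singletonList("db");
    }

    @Override
    public ResourceBundle newBundle(String baseName, Locale locale, String format,
                                    ClassLoader loader, boolean reload) {
        if (!"db".equals(format)) {
            return null;
        }
        return new MapResourceBundle(loadMessages(baseName, locale));
    }

    private Map<String, Object> loadMessages(String baseName, Locale locale) {
        Map<String, Object> messages = new HashMap<>();
        // ... query dataSource for the key/value rows matching baseName and locale ...
        return messages;
    }

    /** Trivial ResourceBundle over an in-memory map. */
    private static final class MapResourceBundle extends ResourceBundle {
        private final Map<String, Object> values;

        MapResourceBundle(Map<String, Object> values) {
            this.values = values;
        }

        @Override
        protected Object handleGetObject(String key) {
            return values.get(key);
        }

        @Override
        public Enumeration<String> getKeys() {
            return Collections.enumeration(values.keySet());
        }
    }
}

The pre-load itself is then just a ResourceBundle.getBundle("global-messages", locale, control) call per locale (the bundle name is a stand-in here), made from a Spring bean's init method during web application startup.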

Link(s) I think I found/used regarding DB backed resource bundles originally.
DB backed resource bundle reference 1

Potential Issues:
For ResourceBundle.Control, the Javadoc for the "needsReload" method says
"The calling ResourceBundle.getBundle factory method calls this method on the ResourceBundle.Control instance used for its current invocation, not on the instance used in the invocation that originally loaded the resource bundle. "  
and the "getTimeToLive" method Java doc says
"All cached resource bundles are subject to removal from the cache due to memory constraints of the runtime environment. "
This seems to leave the possibility that a bundle could be dropped from the cache and would not reload properly, since it wouldn't access the correct ResourceBundle.Control instance, which is only used during application initialization.  I ran into something that acted like this (but without memory pressure that I am aware of), with the result of getting resource-not-found exceptions. I'll be looking into this at some point.

I would really like to propose a Java change that would prevent the above potential issue, but some aspects of the JCP membership agreement and my employer make that difficult.  I had also considered that maybe a change at the JSF2 layer (my current focus) could work but have not looked into it any further.

[2016/05/19] New info.
I ran across something useful and likely better than my original design.  It is facilitated by the ResourceBundleControlProvider interface, new as of JDK 8.  See the Oracle docs here. When time permits or the need arises (which might be soon), I will give this method a shot.
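For reference, the interface itself is tiny.  A provider returning something like the DB-backed Control sketched above might look roughly like this (registered through a META-INF/services/java.util.spi.ResourceBundleControlProvider entry; as far as I can tell, the JDK 8 implementation only picks providers up from the installed extensions, which is part of what I need to verify):

import java.util.ResourceBundle;
import java.util.spi.ResourceBundleControlProvider;

/** Sketch of a JDK 8 ResourceBundleControlProvider returning the DB-backed Control. */
public class DatabaseBundleControlProvider implements ResourceBundleControlProvider {

    @Override
    public ResourceBundle.Control getControl(String baseName) {
        // Only take over our own bundle names; returning null keeps the
        // default lookup behavior for everything else.
        if (baseName.startsWith("global-messages")) {
            return new DatabaseBundleControl(lookUpDataSource());
        }
        return null;
    }

    private javax.sql.DataSource lookUpDataSource() {
        // Getting a DataSource this early (before Spring is up) is exactly the
        // open question discussed below; placeholder only.
        throw new UnsupportedOperationException("DataSource lookup not wired up yet");
    }
}

The appeal is that plain ResourceBundle.getBundle(...) calls - with no explicit Control argument - would then route through the provider automatically.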

As an alternative, I have considered creating a ListResourceBundle that still uses a database for the data.  The downside of this is still the need for a class per bundle, because resolving a bundle is done by looking for a class name reflecting the bundle name and locale. A superclass could probably be created which does all or most of the heavy lifting, and subclasses would exist mainly to satisfy the bundle-to-class resolution.
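A rough sketch of that shape - a superclass doing the database work and a thin subclass per bundle/locale so the default class-name resolution can find it (all names hypothetical):

import java.util.ListResourceBundle;

/** Superclass doing the heavy lifting: pulls the key/value pairs from the DB. */
public abstract class DatabaseListResourceBundle extends ListResourceBundle {

    private final String bundleName;
    private final String localeCode;

    protected DatabaseListResourceBundle(String bundleName, String localeCode) {
        this.bundleName = bundleName;
        this.localeCode = localeCode;
    }

    @Override
    protected Object[][] getContents() {
        return loadContents(bundleName, localeCode);
    }

    // Placeholder for the real data access - a simple SELECT returning
    // {key, value} pairs for the given bundle name and locale in practice.
    protected Object[][] loadContents(String bundleName, String localeCode) {
        return new Object[0][];
    }
}

// In a separate source file: a thin subclass per bundle/locale whose only job
// is to exist under the name the default resolution looks for.
public class GlobalMessages_es extends DatabaseListResourceBundle {
    public GlobalMessages_es() {
        super("GlobalMessages", "es");
    }
}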

The more I think about it, the more I think the ResourceBundleControlProvider is probably the better way.  It will be more complex - without further research, I am concerned about getting the database connectivity into the provider early enough to be useful and without class loader issues.  Much of my DB connectivity uses Spring-based configuration and involves a number of dependencies.  I'd hate to create a new configuration method just to get connections into the provider, but it could be needed.  I'll have to prototype it to verify.