Oracle Directory Manager and Application Development

Oracle Directory Manager is a Java-based tool for administering Oracle Internet Directory (OID), Oracle's LDAP directory. It is the main directory administration tool and is installed with Oracle Internet Directory.

When developing applications, there are often one or more central LDAP directories for developers to use. When working on a new application it is often necessary to reset entries, test scenarios, and so on. However, it is unlikely that everyone’s desktop will have the entire OID installation. On my desk alone there are 3 desktop machines and one laptop, and none of them have the full Identity Management stack. In fact, the laptop is from when I was with Siebel so, although it was manufactured this century, it has little more than JDeveloper, Thunderbird and Oracle Calendar running on it.

One easy way to have Oracle Directory Manager on every developer’s machine, without having to install anything else, is to take advantage of the fact that it is a Java application.

To achieve this, copy some jars (over 15 of them!) from the ORACLE_HOME/jlib and ORACLE_HOME/ldap/oidadmin directories to a directory on your PC. Let’s call it oidadmin. Keep the directory structures. The entire list of jars is below. The main class is oracle.ldap.admin.client.NavigatorFrame and there are a few parameters that need to be passed to it. The entire command line is too long to type, let alone remember, so put it all in a file called oidadmin.cmd (on Windows) in the same oidadmin directory.

oidadmin.cmd

java
-ms4m
-mx128m
-Dsun.java2d.noddraw=true
-Dsun.java2d.font.DisableAlgorithmicStyles=true
-classpath "./ldap/oidadmin/osdadmin.jar;
./jlib/netcfg.jar;
./jlib/help4.jar;
./jlib/help4-nls.jar;
./jlib/oracle_ice.jar;
./jlib/jewt4.jar;
./jlib/share.jar;
./jlib/ewt3.jar;
./jlib/ewt3-nls.jar;
./jlib/ewtcompat-3_3_15.jar;
./jlib/swingall-1_1_1.jar;
./jlib/dbui2.jar;
./jlib/dbui2-nls.jar;
./ldap/oidadmin/oidldap.jar;
./ldap/oidadmin/netutil.jar;
./jlib/oemlt-9_0_2.jar;
./jlib/ldapjclnt10.jar"
oracle.ldap.admin.client.NavigatorFrame
-AdminRoot:Start
-ldap
-AdminRoot:End
-LDAPRoot:Start
-meta
-ohhome
"."
-LDAPRoot:End

The above is formatted for readability and should all be on one line. On Windows I create a shortcut on the desktop to the command file. The final touch is to use the OID Directory Manager icon for the shortcut. Any machine with Java can become an OID Directory Manager machine, which I have found really useful for demonstrations and for collaborating on new solutions.

The differences between Cheque and Check


Banking is old, very old. The first banks were probably the religious temples of the ancient world, as long as 5,000 years ago. Banks probably predated the invention of money. The current modern western financial products and services can be traced back to the coffee houses of London. Even that was a long time ago. The London Royal Exchange was built in the 16th century!

Although North American and Northern European banking share a similar beginning, differences in certain practices have, as you would expect, emerged down through the years. The most obvious difference is the spelling of Cheque. While North America uses the term Check, the Commonwealth nations, and Ireland, use the less ambiguous spelling.

Another notable difference between banking practices on opposite sides of the Atlantic is the use of checks. The use of checks in Europe has been in decline over the past 20 years. Only Ireland, Britain and France use checks to any significant degree. With the advent of debit cards and electronic funds transfers, the checkbook has all but disappeared. I’m certain there are people working in German banks who have never seen a EuroCheque.

One of the things I liked about banking in the US, when I was living in Boston, was the range of personalised checkbooks available. The use of personalised checkbooks, for regular retail customers, is one of the nice touches to banking in North America which one does not get in Europe.

Not only can a customer be issued personalised checkbooks, but they can print their own! The use of computer checks is growing steadily, particularly among sole traders and small businesses that would not normally qualify for big-business perks with their banks. Computer checks are cost-effective and make a really powerful, professional impression. Many of the accounting and money management software packages in use today support the printing of checks. Preprinted paper stock is also available for use with them, such as QuickBooks Checks and Quicken Checks.

In fact, there is a huge business built up around the humble check in the USA. Perhaps this is the reason the check has not disappeared from the North American banking system to the extent it has in Europe.

LifeLock – Identity Theft Protection

Although the FFIEC advises against it, many banks, particularly in the USA, still use single-factor authentication for most, if not all, of their services. The banks do, however, implement a number of pattern- and behaviour-matching techniques in an attempt to find account fraud and identity theft. This is somewhat reassuring until you realise that bank staff, and government agency employees, have been known to lose laptops with customer data and, worse still, not follow their own corporate policies on data protection.

Fraud and identity theft protection is a consumer issue as well as a corporate one. In my day job I focus on the corporate solutions, but there are consumer solutions out there too. One consumer solution of note is LifeLock, which provides some novel approaches to tackling this problem. These include registering, and continually re-registering, fraud alerts with credit bureaus, monitoring address changes, and a $1,000,000 guarantee to cover the costs of restoring things to their proper state if a fraud does take place. The fact that they can put an end to those annoying pre-approved credit letters may well be the most significant immediate value for some.

One of the interesting automated services provided is the recently announced eRecon, which trawls the murky underbelly of the Internet to see if your personal information, or a snippet of same, shows up in the identity thieves’ marketplaces. I guess you could call it the Black Ops of identity theft protection.

My point is that not only is multi-factor authentication a must, but multi-factor identity protection, both corporate and consumer, is a must in the information age.

1 Year Old Today!

It was exactly one year ago that the SOA Station blog went on air. For me, it is really interesting to look back over the content posted and compare it to what was originally envisaged. What is immediately obvious is the broad range of topics within the SOA subject: Security, Governance, Scalability, Quality & Reliability, and of course Interoperability.

Over the entire year, according to Google Analytics, the site received 28.11 visits per day. There was a dramatic increase in September when the blog was listed on Oracle Blogs. Visits per day since then works out as 39.13. Since the beginning of this year, the visits per day is at 51.58. This increase is really encouraging. People are obviously accessing the blog from work as the number of site visits on a Saturday and Sunday are tiny.

The top content (most number of views) is:

  1. SCA Diagram stencil for Visio
  2. Java IBAN check digit validation
  3. Custom XSLT functions in Oracle BPEL and ESB
  4. Oracle Lite and SOA Suite
  5. JAX-WS & JAXB rock and roll…

One thing that did surprise me was the level of interest in industry-specific functionality rather than the generic technical content. The IBAN Check Digit Validation article is one of the most popular on the site. I’ll certainly look into providing more industry-specific content. Suggestions welcome.

Running web service clients without a web service

One of the challenges with large-scale Enterprise Application development is dealing with the dependencies between teams and between parts of the system being developed. Despite Agile software development methodologies, a waterfall style of producing artifacts can occur. For example, the UI cannot be completed because the web services are not built, and the web services are not built because the data model is not complete. The data model is not complete because there are outstanding questions that the customer hasn’t answered.

In an ideal world we would all be able to agree on the integration interfaces up front and then farm out the development effort so that the UI and server-side teams can get coding straight away. One way to make this happen is for the UI team to define the WSDL interfaces for the services they plan to invoke. It is good practice for the ‘caller’ to define the interface where possible.

The UI team could also go ahead and provide their own simple implementation. Included in this article is some code for a servlet that accepts a SOAP Envelope request and returns a SOAP Envelope response. It uses the element name in the SOAP Body to look up an XML file with the same name in the classpath and then returns the contents. A client can define a WSDL, set the URL for the Servlet as the SOAP address and provide a ‘canned’ response XML file for each operation. There is also an Enterprise Service Bus (ESB) approach that is outlined as an alternative at the end of the article.

Simple SOAP Servlet
The servlet uses the JAXP libraries…


import java.io.IOException;
import java.io.InputStream;
import javax.servlet.ServletException;
import javax.servlet.http.*;
import javax.xml.namespace.QName;
import javax.xml.parsers.*;
import org.w3c.dom.*;
import org.xml.sax.SAXException;

// Note: DocumentBuilder is not thread-safe; fine for a simple test stub.
DocumentBuilderFactory builderFactory
        = DocumentBuilderFactory.newInstance();
DocumentBuilder builder = null;

public void init() throws ServletException {
    try {
        this.builderFactory.setNamespaceAware(true);
        this.builder = this.builderFactory.newDocumentBuilder();
    } catch (ParserConfigurationException e) {
        throw new ServletException("Error initialising servlet", e);
    }
}

A mechanism to get the SOAP Body from the request is needed, and this method is part of it. It is inelegant, as getElementsByTagName() just did not work during testing, probably due to some configuration issue in my environment or codebase. Since the SOAP Envelope might contain a Header element, the Body element may be the second child.


private Element getSOAPBodyElement(Element requestSOAPEnvelope) {
    // Walk the Envelope's children, skipping any whitespace text nodes
    // (the original cast of getFirstChild() to Element breaks on those),
    // until the Body element is found. It may be the first or second
    // child element, depending on whether a Header is present.
    for (Node child = requestSOAPEnvelope.getFirstChild();
            child != null; child = child.getNextSibling()) {
        if (child instanceof Element
                && "body".equalsIgnoreCase(child.getLocalName())) {
            return (Element) child;
        }
    }
    return null;
}

The following method is where the real work is done. Most of the effort is spent on making sure that the request is a SOAP Envelope.


private QName getRequestPayLoadQualifiedName(HttpServletRequest request)
        throws ServletException, IOException {
    try {
        Document requestXML = this.builder.parse(request.getInputStream());
        Element requestSOAPEnvelope = requestXML.getDocumentElement();
        if (!requestSOAPEnvelope.getLocalName().equalsIgnoreCase("envelope")) {
            throw new ServletException(
                    "Unable to parse request. Are you sure it is a SOAP Envelope?");
        }

        Element requestSOAPBody = this.getSOAPBodyElement(requestSOAPEnvelope);
        if (requestSOAPBody == null) {
            throw new ServletException(
                    "Unable to parse request. Body element not found."
                            + " Are you sure it is a SOAP Envelope?");
        }

        // Skip any whitespace text nodes before the payload element.
        Node payloadNode = requestSOAPBody.getFirstChild();
        while (payloadNode != null && !(payloadNode instanceof Element)) {
            payloadNode = payloadNode.getNextSibling();
        }
        Element requestPayload = (Element) payloadNode;
        return new QName(
                requestPayload.getNamespaceURI(),
                requestPayload.getLocalName());
    } catch (SAXException e) {
        throw new ServletException(
                "Unable to parse request. Are you sure it is a SOAP Envelope?", e);
    }
}
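The payload-name extraction above can be exercised standalone, without a servlet container. The following is a minimal sketch that mirrors the same DOM-walking logic on a hard-coded envelope string (the class name PayloadNameDemo and the sample namespace urn:example are illustrative, not from the original article):

```java
import java.io.ByteArrayInputStream;
import javax.xml.namespace.QName;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

public class PayloadNameDemo {

    // Parse a SOAP envelope string and return the qualified name of the
    // first element inside the Body, mirroring the servlet's logic.
    public static QName payloadName(String soap) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);
        DocumentBuilder builder = factory.newDocumentBuilder();
        Document doc = builder.parse(
                new ByteArrayInputStream(soap.getBytes("UTF-8")));
        Element envelope = doc.getDocumentElement();

        // Find the Body element, skipping a Header and whitespace nodes.
        Element body = null;
        for (Node c = envelope.getFirstChild(); c != null; c = c.getNextSibling()) {
            if (c instanceof Element && "body".equalsIgnoreCase(c.getLocalName())) {
                body = (Element) c;
                break;
            }
        }

        // The first child element of the Body is the payload.
        for (Node c = body.getFirstChild(); c != null; c = c.getNextSibling()) {
            if (c instanceof Element) {
                return new QName(c.getNamespaceURI(), c.getLocalName());
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        String soap = "<e:Envelope xmlns:e=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<e:Body><myOperation xmlns=\"urn:example\"/></e:Body></e:Envelope>";
        // The local part is "myOperation", so the servlet would serve myOperation.xml.
        System.out.println(payloadName(soap));
    }
}
```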

And finally the doPost method, which looks up the file and writes its contents to the response stream. Note that due to the use of getResourceAsStream, the XML file is expected to be on the classpath in the same package as the servlet. Also, the response content type must be set to text/xml for a SOAP response.


public void doPost(HttpServletRequest request, HttpServletResponse response)
        throws IOException, ServletException {
    QName requestPayLoadQualifiedName =
            this.getRequestPayLoadQualifiedName(request);
    InputStream responseXML = this.getClass().getResourceAsStream(
            requestPayLoadQualifiedName.getLocalPart() + ".xml");

    if (responseXML == null) {
        throw new ServletException("Unable to find "
                + requestPayLoadQualifiedName.getLocalPart()
                + ".xml in the classpath.");
    }
    response.setContentType("text/xml");
    try {
        // Copy the canned response to the response stream.
        int respInt = responseXML.read();
        while (respInt != -1) {
            response.getOutputStream().write(respInt);
            respInt = responseXML.read();
        }
    } finally {
        responseXML.close();
    }
}

An example of where this could be used is with a Flex front end where the SWF file is hosted in a web app. The WAR file could be organised as follows:
/MyApplication.SWF (the Flex front end)
/MyApplication.html (wrapper html which embeds the SWF file)
/MyService.wsdl (WSDL defining the interface for the web service. Has servlet address as endpoint)
/WEB-INF/web.xml (registers the SimpleSOAP class as a servlet)
/WEB-INF/classes/SimpleSOAP.class (the servlet)
/WEB-INF/classes/myOperation.xml (the canned response for a call to ‘myOperation’)
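
For completeness, the web.xml registration mentioned above might look something like this minimal sketch (the servlet name and URL pattern are illustrative assumptions, not from the original article):

```xml
<web-app>
  <servlet>
    <servlet-name>SimpleSOAP</servlet-name>
    <!-- SimpleSOAP.class sits in the default package under WEB-INF/classes -->
    <servlet-class>SimpleSOAP</servlet-class>
  </servlet>
  <servlet-mapping>
    <servlet-name>SimpleSOAP</servlet-name>
    <url-pattern>/services/*</url-pattern>
  </servlet-mapping>
</web-app>
```

The soap:address in MyService.wsdl would then point at the /services path of the deployed web app.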

The UI development team can now orchestrate their screens, making web service calls to web services that haven’t been implemented yet. Once they are implemented, the soap:address in the WSDL can change. The above servlet could be further extended with some XPath expressions to map certain combinations of parameters to responses. One could make it really sophisticated and have it follow a sequence like a demonstration script. However, putting all that together takes the pressure off the server-side team in delivering the real implementation, doesn’t it?
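The XPath-based mapping of parameters to responses could be sketched as follows. This is only an illustration under assumed names: the class ResponseChooser, the customerId element and the file names are all hypothetical, not part of the servlet above:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class ResponseChooser {

    // Hypothetical mapping: pick a canned response file based on a value
    // extracted from the request payload with an XPath expression.
    public static String responseFileFor(Document requestXML) throws Exception {
        XPath xpath = XPathFactory.newInstance().newXPath();
        // local-name() sidesteps namespace-context setup for this sketch.
        String customerId =
                xpath.evaluate("//*[local-name()='customerId']", requestXML);
        if ("42".equals(customerId)) {
            return "myOperation-gold.xml"; // special-case canned response
        }
        return "myOperation.xml"; // default canned response
    }

    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        "<myOperation><customerId>42</customerId></myOperation>"
                                .getBytes("UTF-8")));
        System.out.println(responseFileFor(doc));
    }
}
```

The doPost method would call such a helper instead of deriving the file name directly from the payload element's local part.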

Simple ESB Solution
Another approach would be to use an Enterprise Service Bus with routing rules that read responses from files. The ESB approach would also allow for more content-based routing, giving different responses depending on parameters passed at runtime. If the development environment is going to involve an Enterprise Service Bus, then the above servlet approach is best limited to individual developers’ environments or to simple automated component testing of the UI.