Tuesday, June 27, 2006

Non-Invasive Application Integration - Not your Father's screen scraping

Let's talk about integrating with a Legacy application via its user interface.

If you can access the data underneath a legacy application, nine times out of ten it is going to be faster than getting it from the screen. But since no one is going to advocate one application updating a legacy database concurrently with another, sometimes the user interface is the only way to go.

There are good reasons: the legacy application provides all the validation and marshalling of data into the appropriate (usually non-relational) tables. And there are those occasions where the legacy application is presenting some calculated value, the result of a piece of business logic that is entombed inside a tight ball of dreaded spaghetti code.

Vendors of legacy integration tools wince at the term "screen scraping", and the term appears to evoke fear and loathing among IT professionals looking for a way to glue their legacy application together with a shiny new contemporary application.

I think I know how this came about.

In the 80s, most of the products that I shall refer to as "first generation" legacy integration tools were effectively glorified terminal emulators. These tools relied on stateful connections with the host system over indirect connections (networks) and direct connections (serial communications and multiplexers).

A number of factors combined to create painful experiences for the IT pros called upon to implement solutions with these tools.

First, it was early days - slow communications and a lack of experience with this emerging market gave consumers a collection of flaky tools to choose from.

Second, legacy applications weren't so "legacy" back then. They were living and breathing applications being actively maintained by the thick eye-glasses and pocket protector brigade (yeah, that's right, the same guys who got paid mega-bucks in the late 90s to make their own damn code Y2K compliant), which meant that those X-Y screen maps you spent hours creating were rendered useless the very second a maintenance release hit the production environment.

So what has changed between then and now? On the first point: Darwin's theory of natural selection. Some vendors went bust; a few changed direction or went back to pushing regular terminal emulation. But a few vendors remained in the market, learned from their early mistakes and those of others, and a second generation of "re-facing" products emerged, delivering smarter development tools and strategies for change control to mitigate the work of the pocket protector brigade. Also, the legacy applications became more "legacy" as some of the guys with thick glasses moved to Florida and there were fewer of them around to mess with the display.

Nowadays we are on the third generation. Web-to-host, thin clients, service interfaces, integration servers, stateless connections, pooled host sessions and code generators no less. Sounds cool, huh? Well yeah it is. I especially get a kick out of the code generation. Programs begetting programs, it's all a bit Arthur C. Clarke.

However, I think the biggest innovation comes to us in the form of the alternative routes into the legacy application, especially mainframe applications. 3270 and 5250 are nice protocols to work with. IBM did a good job when it designed them as data streams. They were light years ahead of character mode protocols such as DEC VT and the myriad variants of the day. The concept of delegating the work of interacting with a user to a client device, thus enabling the server to handle much higher rates of concurrency, is the same approach used by that there new fangled electric inter-web.

But protocols are usually stacked on top of protocols, and each layer becomes less friendly yet more efficient the deeper we go, until it becomes positively primitive. Sneaking in underneath 3270 we find the 3270 Bridge, FEPI (the Front End Programming Interface) and, deeper still, the COMMAREA. The ability to plug in at any point in the stack is a powerful one, yielding large gains in throughput as we strip away unnecessary fluff.
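To make that concrete, here is a minimal sketch (in Python, with an entirely hypothetical field layout - real COMMAREAs are defined by the CICS program being called, usually in COBOL copybook terms) of what stripping away the fluff buys you: instead of typing into screen fields and reading characters back, a COMMAREA-style call exchanges a fixed-width byte buffer with the host program directly.

```python
import struct

# Hypothetical record layout for an account-inquiry program:
# 8-byte account id, 2-byte request code on the way in;
# 4-byte signed balance at the front of the buffer on the way back.
def build_commarea(account_id: str, request_code: int) -> bytes:
    # A fixed-width, padded record - no screen fields, no attribute
    # bytes, no cursor positioning. Just the data.
    return struct.pack(">8sH", account_id.encode("ascii").ljust(8), request_code)

def parse_reply(raw: bytes) -> int:
    # The called program overwrites the buffer with its answer.
    (balance,) = struct.unpack(">i", raw[:4])
    return balance
```

The point of the sketch is the shape of the exchange, not the layout itself: once you are below the presentation layer, integration becomes a matter of packing and unpacking records rather than parsing screens.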

Of all the legacy applications out there those hosted by mainframes present the most options for robust non-invasive programmatic integration, especially CICS applications.

The iSeries with TN5250 comes in a close second, given a subsystem that has been tuned to recognize that robot users behave like batch jobs rather than interactive jobs with the traditional 12-second key/think time.

Bringing up the rear are the character mode applications, traditionally hosted on UNIX systems and not usually written to any UI standard. Every application is entirely different from all the others. Applications that are based on some kind of framework, or were generated by a 4GL, tend to be easier to integrate with, in that screen A behaves pretty much the same as screen B.

Integration tools that perform well with character mode applications come from vendors who understand that it is not so much about managing screens as it is about managing the state of a screen, and having smart algorithms for figuring out when the application has finished painting. Input Inhibited Flag? Pah! Luxury!
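For what it's worth, here is a rough sketch of the kind of settle-detection I mean, assuming a caller-supplied read_screen function that returns the current screen text (the function name and timings are my own, not any vendor's): with no input-inhibited flag to consult, "unchanged for a quiet period" is about the best definition of "finished painting" you can get.

```python
import time

def wait_for_screen_settle(read_screen, quiet_ms=200, timeout_ms=5000):
    """Poll a character-mode screen buffer until it stops changing.

    Treats 'no change for quiet_ms milliseconds' as 'finished
    painting', and gives up after timeout_ms.
    """
    deadline = time.monotonic() + timeout_ms / 1000.0
    last = read_screen()
    settled_since = time.monotonic()
    while time.monotonic() < deadline:
        time.sleep(0.02)                      # polling interval
        current = read_screen()
        if current != last:
            last = current                    # still painting
            settled_since = time.monotonic()
        elif (time.monotonic() - settled_since) * 1000.0 >= quiet_ms:
            return current                    # stable long enough
    raise TimeoutError("screen did not settle")
```

The tuning knobs are the whole game here: too short a quiet period and you scrape a half-painted screen, too long and every transaction pays the penalty.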

I have seen plenty of vendors who do well with page mode protocols (3270 & 5250) but truly suck at character mode protocols and vice versa. Only a few vendors do well at both and offer the ability to drill into the stack of supporting protocols where available.

The best performers are those tools that can pool host sessions and park them for re-use. Serving 100 users with 10 interactive host sessions is an efficient use of resources.
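A minimal sketch of the pooling idea, with a caller-supplied connect factory standing in for the real sign-on logic (a production tool would also navigate each session back to a known "parked" screen before returning it to the pool):

```python
import queue
from contextlib import contextmanager

class HostSessionPool:
    """A handful of pre-signed-on host sessions serving many callers."""

    def __init__(self, connect, size):
        # Sign on all sessions up front; callers borrow and return them.
        self._idle = queue.Queue()
        for _ in range(size):
            self._idle.put(connect())

    @contextmanager
    def session(self, timeout=None):
        # Block until a session is free, hand it out, then park it again.
        s = self._idle.get(timeout=timeout)
        try:
            yield s
        finally:
            self._idle.put(s)
```

Borrowing a session is then just `with pool.session() as s:` - the expensive sign-on happens once per session, not once per user request.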

The need for non-invasive programmatic integration with legacy applications is going to be with us for quite some time. In the past year two of my clients purchased brand new mainframes from IBM while at the same time developing new enterprise class applications (one went with .Net, the other with J2EE). While many discuss their company’s “mainframe replacement” strategy, few fully appreciate the time and planning required to make this a reality. Good quality application integration can offer a roadmap towards this goal by enabling an enterprise to control the pace of change while still responding to the requirements of the business, which is why we all have jobs in the first place.

A wise man once told me that any application currently in production is "legacy". Already vendors are offering tools to expose services from legacy Win32 applications as well as tools for 'clipping' content from legacy web applications. The wheel keeps turning and many of the techniques used currently for non-invasive programmatic integration can be applied to contemporary applications.

I hope that we learn the lessons of the past and embrace concepts such as Service Oriented Architecture while striving for interoperability with standards such as web services.

Leveraging standards is the key to successful legacy integration. So when today's sexy enterprise applications become tomorrow's dinosaurs, our children will either praise us or flame us based upon our adoption of well-defined standards.

So, as I grow older I remember what has gone before and wonder what is to come. Stronger eye glasses for sure, and maybe an urge to carry several colors of ball point pen with me "just in case". But I do know this… there will always be a legacy application that needs liberating.


James
June 2006
Minneapolis

Friday, June 23, 2006

Making Legacy Applications "Open"

Service-Oriented Architecture promises agile, open information technology solutions that will enable businesses to better serve the customer, to reduce costs and to be more competitive. SOA is not something that can be ordered for next day delivery. It is an ideology that can get lost in the rush to meet deadlines.

There are so many ways in which to expose core business functions that are locked away in legacy applications that there is almost no excuse for not doing it. Trouble is, there seems to be a general perception that programmatic integration with legacy applications is unreliable. In my experience this is not the case.

There are many vendors in this market with many years of experience in providing tools for just this job. The biggest challenge for an IT department these days is choosing the right tool from the right vendor. Proof of Concept, dude! There is no such thing as a bad POC; except for a vendor, of course :-)

I'll be upfront about the purpose of this web log. It is a place to share ideas and discuss technical details. It is not a place to promote one vendor over another. This is a big deal: I am a Solution Architect for a vendor of Legacy Application Integration tools.

There are a lot of misconceptions about integrating with Legacy applications, and I am hoping that this will be a place to dispel many of them.

James
June 2006
Atlanta