Putting the Pieces Together
Vendors and businesses tackle the “Grand Challenge” of data integration.

Be careful what you wish for. If nothing else, overcoming the millennium bug proved one thing, says Lou Agosta, director of research for Giga Information Group (Cambridge, Mass.): legacy systems are definitely here to stay.
Now that IT departments everywhere have finished congratulating themselves for averting a Y2K crisis, some may soon start wishing—secretly—that they hadn't been quite so successful. After all, as any management team that's gone through a massive data integration project can attest, preserving a legacy system may not always seem like the time- and money-saving solution it is.
"Now that we're on the far side of this Y2K problem, the legacy systems are going to be with us," Agosta says. "It doesn't make any business sense to overturn a legacy system, so you have to make sure they remain efficient."
Some IT managers are now facing this reality with a degree of dread—not because of any hostility toward their old platforms, but because of the difficulties that can arise when a company wants to retain its old system while modernizing its technology at the same time. Preserving a company's investment in its computing infrastructure doesn't do any good unless the years of accumulated data stored on those systems are made relevant, accessible and usable. And that, says Agosta, is “a grand challenge.”
"Due to the diversity of databases, due to the diversity of hardware platforms, due to the diversity of software vendors, this is a very, very difficult proposition," says Agosta, the author of “The Essential Guide to Data Warehousing” (Prentice Hall, 1999). "It's a challenge on the order of difficulty of getting a robot to see or getting a computer to play chess. I mean, Chessmaster is a pretty darn good program, but it won't beat Kasparov."
Ideally, true data integration would allow a homegrown application to draw on data located on different servers, accessed seamlessly and transparently from a client machine, to gain a complete view of the subject being queried. The potential benefits of making applications that were never designed to communicate share their data have led many businesses on a quest to develop the ideal integration solution.
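A rough sketch of that ideal, in Python with invented table and customer names, shows the pattern: query each server separately, then merge the results into the single view the client sees. It illustrates the general idea only, not any vendor's product.

    import sqlite3

    # Hypothetical illustration: two independent data stores, each holding
    # part of the picture for the same customer. Names and schemas invented.
    billing = sqlite3.connect(":memory:")
    billing.execute("CREATE TABLE invoices (cust_id TEXT, amount REAL)")
    billing.execute("INSERT INTO invoices VALUES ('C1001', 1250.00)")

    support = sqlite3.connect(":memory:")
    support.execute("CREATE TABLE tickets (cust_id TEXT, status TEXT)")
    support.execute("INSERT INTO tickets VALUES ('C1001', 'open')")

    def customer_view(cust_id):
        """Query each source separately, then merge into one record --
        the 'complete view of the subject' an integrated client would see."""
        view = {"cust_id": cust_id}
        view["invoices"] = billing.execute(
            "SELECT amount FROM invoices WHERE cust_id = ?", (cust_id,)).fetchall()
        view["tickets"] = support.execute(
            "SELECT status FROM tickets WHERE cust_id = ?", (cust_id,)).fetchall()
        return view

    print(customer_view("C1001"))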
In recent years, there has been a movement to establish a standard for XML/EDI that would enable companies to transport many types of documents among disparate applications within an XML framework. John Capobianco, senior VP and chief marketing officer for Bluestone Software Inc. (Philadelphia), says Bluestone is working to incorporate XML documents into its integration applications because it believes XML is the technology of the future for data management.
"An XML server, its purpose in life is to correlate information from disparate platforms," Capobianco explains. "The self-defining attributes of an XML document make it much easier to deal with. …XML is the format for data exchange. It is the only one that's going to be acceptable."
Agosta questions whether XML should automatically be regarded as the next-generation solution for data integration problems, or whether it's just a quick fix coupled with a lot of hype.
“XML is not exactly a high-performance tool …but it's a glimmer of hope," Agosta counters.
The debate over the technology’s usefulness is itself still premature, however, because the impact of XML is just beginning to be felt on the Internet, and it will likely be some time before the language evolves to more varied uses. In the meantime, says Brian Reed, VP of product management for middleware vendor Merant (Rockville, Md.), companies have essentially three options for getting their various applications to talk to each other. Reed says a company that wants to run an integrated operation with access to all the data it needs must either establish a data warehouse, develop an Enterprise Application Integration (EAI) strategy, or implement a data integration solution.
Reed explains the distinction:
“The role EAI plays is really one of automating process workflow ... it’s totally back-office oriented. It has nothing to do with letting end users view the data,” Reed says. “Data warehousing, data mining, they’re really great for historical information, but what you miss in a data warehouse is the real-time information. If I’m going to visit a customer, I may be able to get all the information about the customer’s history with my company, but say they call the day before and say they’ve had a major development. If I’m warehousing every 24 hours, there’s no way I’m going to see that.”
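Reed's point about refresh latency is easy to see in a toy example. In the sketch below (all data invented), a warehouse loaded nightly simply cannot show an event that arrived after the last extract, while a real-time view can.

    from datetime import datetime

    # Invented event feed from an operational system.
    events = [
        {"cust": "C1001", "note": "contract renewed",
         "at": datetime(2000, 3, 1, 9, 0)},
        {"cust": "C1001", "note": "major outage reported",
         "at": datetime(2000, 3, 2, 16, 30)},
    ]

    # A nightly warehouse load only sees events up to the last extract.
    last_extract = datetime(2000, 3, 2, 2, 0)

    warehouse_view = [e for e in events if e["at"] <= last_extract]
    live_view = events  # what a real-time integration layer would expose

    print(len(warehouse_view))  # 1 -- yesterday's outage is invisible
                                #      until tonight's load runs
    print(len(live_view))       # 2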
"There are 101 methods, and all of them are imperfect, and all of them are useful," Giga’s Agosta says. He adds, however, that some companies that want to avoid the process altogether may try to convert their data on an as-needed basis to allow it to run on other platforms, rather than integrating it so that it runs transparently. This is an ineffective long-term strategy, he believes. "The conversion itself is such an expensive process, an integration project for the longer-term really is inevitable."
Following a merger that brought 500 new users onto its system, Stockholm-based insurance company Länsförsäkringar turned to Mynta Management (Stockholm, Sweden) consultants in search of a way to ease the training process for new employees. The company had been using the Life/400 insurance and pension administration system for 10 years, and although it was happy with the system, it felt the green-screen user interface was “complex, with a steep learning curve,” according to Fredrik Klinteback, senior systems architect for Mynta.
Klinteback’s team decided that a Web interface, distributed across the company’s intranet, would be a suitable solution for making the administration system more intuitive and easier for new employees to learn. But because Länsförsäkringar ran its intranet on Windows NT and Microsoft’s Internet Information Server, the first priority became finding a middleware solution to carry the Life/400 data between the AS/400 and NT servers. Klinteback settled on ActiveObjects/400 from SystemObjects Corp. (West Chester, Pa.) because of its quick response time, easy implementation and transparency to the user. First implemented in August 1999, the system is now in its third release, and, says Klinteback, “The customer has never seen such happy and satisfied users.”
The entire project took eight months to implement, three of which were almost entirely consumed by Y2K activities. By “eating the elephant in small parts,” rolling out small pieces to users as they were implemented, and creating a detailed plan for handling errors and changes, Mynta was able to put together an effective solution well within budget and time constraints. Integrating the two systems was viewed as absolutely crucial, Klinteback says, so Länsförsäkringar never considered abandoning its AS/400 and running all of the applications on the Windows NT server.
“The Life/400 system is too complex to port to another platform, and no alternative product exists,” he explains. “Even if one did, the task of converting all existing insurance policies to a new system would probably be several factors more expensive.”
Tobin Gilman, director of product marketing at Hyperion (Sunnyvale, Calif.), believes demand for his company’s Application Link product remains strong precisely because more and more companies are realizing the necessity of linking apparently incompatible systems.
“These kinds of products are essential,” he says. “So many companies today are made up of different divisions or are a product of mergers, acquisitions, takeovers. You never find a company that is running all of its software, all of its infrastructure from the same vendor. It just doesn’t happen that way in the real world. There’s no vendor that provides all of the software a company needs.”
Although many experts regard data integration as essential, and even inevitable, they don’t dismiss as unfounded the apprehension many IT managers feel when faced with a data integration project. There are serious concerns and complications to consider, as Eric Hymel of Fuelman Inc. can attest.
Fuelman (Covington, La.) sells software and services to small oil companies. Its AS/400 software issues gas cards, processes payments and performs a number of related functions. Following the acquisition of competitor Gascard in 1995, the company installed a Windows NT server to help control clients’ costs by providing an authorization tool as each purchase is processed on the AS/400 system. Fuelman adopted StarQuest Software’s (Berkeley, Calif.) Data Replicator (SQDR) to allow the two servers to communicate in real time as each purchase was submitted, processed and authorized. Soon after the system was implemented, however, Hymel says, it became clear that it could not handle the required volume of transactions. In fact, the results were so problematic that Hymel is now in the midst of overhauling his entire system.
“We had memory leaks, random system reboots—just nightmare scenarios for an infrastructure manager,” he recalls.
According to Hymel, the company knew from the beginning that SQDR did not offer every feature Fuelman required. The company’s capacity demands were high, the application had no data conflict resolution functions, and it did not allow users to edit rows of data. Many features “had to be built from scratch.” Hymel says the problems Fuelman experienced might have been unavoidable, because in the course of his search for a solution, he has come to believe the right technology is simply not yet available.
“The problem of bi-directional large high-volume data replication is difficult,” he explains. “…We chose SQDR because we felt like its competition couldn’t even offer what it offered. We guess we spent about 3,000 hours to implement this, and even if there had been software that did exactly what we wanted, it might have taken 1,000 hours.”
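To give a flavor of why bi-directional replication is hard, here is a deliberately crude sketch of the simplest conflict rule, last-writer-wins by timestamp, applied when the same row is updated on both servers. The data and logic are invented for illustration; this is not Fuelman's code or SQDR's behavior, and a real system would also have to contend with clock skew, deletes and field-level edits.

    from datetime import datetime

    # The same card record, updated independently on each server
    # (invented data; "limit" is a hypothetical spending limit field).
    as400_row = {"card": "GC-100", "limit": 500,
                 "updated": datetime(2000, 3, 2, 10, 15)}
    nt_row    = {"card": "GC-100", "limit": 750,
                 "updated": datetime(2000, 3, 2, 10, 17)}

    def resolve(a, b):
        """Last-writer-wins: keep whichever copy was updated most
        recently. Crude -- it silently discards the losing update,
        which is exactly the kind of decision a replication product
        has to let its users control."""
        return a if a["updated"] >= b["updated"] else b

    winner = resolve(as400_row, nt_row)
    print(winner["limit"])  # 750: the NT-side update wins by two minutes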
Giga’s Agosta agrees that data integration technology has had trouble keeping pace with new developments such as clickstream and other data formats. While he believes the process is essential, he cautions that this disparity will continue to make data integration a difficult prospect for businesses. "In the 1980s the message was, manage your data as a corporate resource. Fast-forward 20 years, and the message is manage your data as a corporate resource. So as you see, we haven’t come very far. And part of the problem is we’re chasing a moving target.”