
The Resurgence of Data Integration

Data integration vendors cite growing interest among new and existing customers in mainframe data connectivity solutions.

The industry watchers best positioned to tell us about mainframe market growth probably aren't the usual suspects -- IBM Corp., say, or analyst firms such as Gartner Inc. -- but rather the vendors helping customers move data to and from mainframe environments.

In years past, vendors of data integration, data synchronization, and changed-data-capture (CDC) tools were tapped to move data once and irrevocably away from mainframe platforms. Today, companies such as Informatica Corp. and GoldenGate Software (on the data replication front), Evolutionary Technologies International Inc. (ETI, an ETL and changed-data-capture specialist), and SyncSort (best known for its high-speed, bulk ETL capabilities) say they're seeing more interest in mainframe data connectivity solutions from both existing and new customers.

Some users are interested in moving data back to the mainframe, perhaps to take advantage of next-generation initiatives such as zLinux or, increasingly, Big Blue's System z Integrated Information Processor (zIIP). Most customers, however, seem to have made their peace with the mainframe: they aren't trying to permanently offload applications or workloads; instead, these vendors say, they're maintaining -- and frequently growing -- the status quo, while making mainframe-resident data available to distributed systems via data replication or CDC technologies.

"We don't really see as many migrations off the mainframe or back to the mainframe," says Sami Akbay, vice-president of marketing and product management with GoldenGate Software. "We see maybe doing other migrations [off a platform] -- things like OpenVMS to HP-UX, or even Tru64 [Unix] to AIX, but not really with the mainframe." Big Iron's most attractive feature, according to Akbay, is its predictability: It's fast, it's reliable, and it works, says Akbay. "The mainframe just works. People love it.”

zIIP has been successful as a means of highlighting the importance of processing data on the mainframe (a large percentage of mission-critical data already lives there, so why not process it there?) and of demonstrating the affordability of shifting apostate data processing workloads back to Big Iron. It isn't altogether clear, however, that zIIP has achieved that second goal: new data processing workloads don't seem to be coming back to the mainframe so much as existing workloads -- no longer under siege by the distributed threat from without -- are staying put.

That's a perspective endorsed by Akbay and other data integration vendors. "No one wants to touch [remove] the mainframe, but when [customers] have to use it for reporting, it's too expensive in terms of the number of MIPS [they] have to consume, so they tend to offload certain parts of the information to secondary systems running on AIX or Linux." From there, Akbay says, they do their BI, reporting, or query and analysis processing.
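To make the offload pattern concrete, here is a minimal sketch of a capture-and-apply loop: changes recorded against the system of record are periodically copied to a reporting replica, so that query and reporting workloads run against the replica rather than consuming mainframe MIPS. This is an illustration only, not a depiction of GoldenGate's product; the changelog and accounts tables, the column names, and the use of Python with SQLite stand-ins for both systems are all assumptions.

import sqlite3

# Minimal sketch of the offload pattern (hypothetical schema): poll a
# captured-change table on the system of record and apply each change to a
# reporting replica, so queries run against the replica, not the source.

def apply_changes(source, replica, last_seq):
    """Copy changes recorded after last_seq from source to replica; return the new high-water mark."""
    rows = source.execute(
        "SELECT seq, op, account_id, balance FROM changelog "
        "WHERE seq > ? ORDER BY seq", (last_seq,)
    ).fetchall()
    for seq, op, account_id, balance in rows:
        if op == "D":  # delete captured on the source
            replica.execute("DELETE FROM accounts WHERE account_id = ?", (account_id,))
        else:          # insert or update: upsert into the replica
            replica.execute(
                "INSERT INTO accounts(account_id, balance) VALUES (?, ?) "
                "ON CONFLICT(account_id) DO UPDATE SET balance = excluded.balance",
                (account_id, balance),
            )
    replica.commit()
    return rows[-1][0] if rows else last_seq

if __name__ == "__main__":
    source = sqlite3.connect(":memory:")   # stand-in for the system of record
    replica = sqlite3.connect(":memory:")  # stand-in for the reporting copy on AIX/Linux
    source.execute("CREATE TABLE changelog(seq INTEGER PRIMARY KEY, op TEXT, account_id INTEGER, balance REAL)")
    replica.execute("CREATE TABLE accounts(account_id INTEGER PRIMARY KEY, balance REAL)")
    source.executemany("INSERT INTO changelog(op, account_id, balance) VALUES (?, ?, ?)",
                       [("I", 1, 100.0), ("U", 1, 250.0), ("I", 2, 75.0)])
    print(apply_changes(source, replica, last_seq=0))

In practice, the capture side would be fed by something like DB2 or IMS log records and the replica would live in a relational database on the AIX or Linux side; only the shape of the capture-and-apply loop matters here.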

"At the core, there's certainly a faithful allegiance to the mainframe because it does work, and it works well, so our customers, when they come to us, are mainly looking to share that data [with distributed systems] -- not to migrate it [off] once and for all."

Don Tirsell, senior director of product marketing with Informatica, agrees. "We've seen a steady stream of migration projects for a while, ever since the Striva acquisition days," he told Enterprise Strategies last week.

Prior to its acquisition by Informatica, Striva specialized in mainframe connectivity; Informatica has since rebranded the former Striva product as PowerExchange. (That product consolidates a number of connectivity offerings, including Big Iron access. Customers can license PowerExchange and pay for the connectivity adapters that address their needs.)

Last week, Informatica announced a new Data Migration Suite -- essentially packaged software combined with a migration methodology and services -- that serves as a mainframe migration solution. Tirsell and Informatica aren't just pulling data off Big Iron, however.

"There's another area where we're helping people move from legacy apps on the mainframe to z/Linux-based applications, so this suite will help facilitate this process," he explains. From Tirsell's perspective, most shops that planned to move away from Big Iron have already done so. "We've seen the number in the 100 to 500 MIPS range -- most of those migrations are pretty much done, unless you're really a legacy-to-legacy customer, where you've got an old, old mainframe that you're just not going to touch," he observes. "Most of the migrations [away from the mainframe] were more prevalent two or three years ago. Since then, I think the curve has really flattened out."

Wyatt Ciesielka, vice-president of North American sales with ETI, cites a similar experience. His company's data integration platform, ETI Solution V6, supports almost any conceivable platform: it generates C, Java, COBOL, and a variety of other languages (essentially, compiled executables that extract and move the data) against a source system, and the generated code can run on virtually any target platform, including natively on the mainframe. (Code generation isn't as much of a throwback as you'd think, either. ETI's approach is very popular with government agencies, especially with the Department of Defense, as well as large banking and financial institutions, because of its inherent security.)
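As a loose, hypothetical analogy for that generation approach (ETI's actual tooling is far more capable, and the record layout, field names, and output format here are invented), the sketch below emits a small C extraction program from a fixed-width record description; the generated source could then be compiled and run natively on whatever platform hosts the data.

# Toy illustration of generation-based integration: given a fixed-width record
# layout, emit a small C program that reads records from standard input and
# writes them out as pipe-delimited text. The layout and fields are invented.

LAYOUT = [  # (field name, byte offset, length) in a hypothetical 80-byte record
    ("cust_id", 0, 10),
    ("name", 10, 30),
    ("balance", 40, 12),
]
RECORD_LEN = 80

def generate_extractor(layout, record_len):
    """Return C source for a program that turns fixed-width records into delimited lines."""
    writes = "\n".join(
        f"        fwrite(rec + {off}, 1, {length}, stdout); putchar('|');  /* {name} */"
        for name, off, length in layout
    )
    return f"""#include <stdio.h>

int main(void) {{
    char rec[{record_len}];
    while (fread(rec, 1, {record_len}, stdin) == {record_len}) {{
{writes}
        putchar('\\n');
    }}
    return 0;
}}
"""

if __name__ == "__main__":
    print(generate_extractor(LAYOUT, RECORD_LEN))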

Ciesielka and other ETI officials we spoke to say their mainframe connectivity business isn't just holding steady; according to Ciesielka, ETI expects that business to continue to grow. The company recently sponsored a whitepaper -- Back to the Future: Maximizing the Use of Your Mainframe Legacy Resources -- to highlight its own capabilities in this respect.

ETI has a good story to tell here, Ciesielka points out: its integration products (which include a CDC capability) can shift data bi-directionally within z/OS, between z/OS and open systems (e.g., Windows, Linux, Unix), and between z/OS and other so-called "legacy" platforms (e.g., MVS, VSE, OS/400, and VMS).

The salient point, Ciesielka says, is that Big Iron customers are staying on the mainframe. In some cases, he confirms, ETI is seeing more interest in shifting data (apostate or otherwise) back to the mainframe.

That's a point echoed by Patricia Rickly, manager of marketing operations for SyncSort, whose namesake sort product started out on the mainframe three decades ago. SyncSort -- which has since grown beyond its mainframe roots -- also markets a data conversion, transformation, and integration solution for open systems, dubbed DMExpress, which includes a feature called FilePort, originally designed to facilitate one-way movement from mainframe environments to distributed systems. Thanks in part to the resurgence in Big Iron's popularity, DMExpress now goes the other way, too, according to Rickly.

"When we originally introduced the FilePort utility, it was designed to take things from the mainframe and synchronize them with Unix systems, so we only had [synchronization] in one direction; but our customers asked to be able to [synchronize] back and forth, because they wanted to be able to do the bulk processing on the mainframe," she indicates.

However, Rickly stops short of emphasizing a resurgence in information processing on the mainframe. "Most of our customers are doing their legacy applications. We don't see that many new applications, where they're moving data back, but we really have not seen that much movement off the mainframe, either."

The takeaway, says GoldenGate's Akbay, is that while new data processing workloads aren't necessarily moving back to the mainframe (not en masse, anyway) as a result of zIIP or other IBM efforts, mainframe application environments such as CICS and IMS aren't going anywhere, either.

The data they're generating, of course, is another matter: organizations are increasingly exposing that data -- via replication, synchronization, or CDC tools -- to a teeming application ecosystem. "Trying to replicate the data to open systems seems to be a more common trend than trying to replace these old systems. Old is not necessarily bad. These things are well-tested, well-proven, they work like a charm, and no one wants to put their reputation on the line and take that risk," he says.
