Why IBM Wants to Put Mainframes at the Center of BI

IBM asks the not-so-obvious question: If you already have a mainframe in-house, why not make it the centerpiece of your business intelligence and data warehouse efforts?

For a couple of years now, IBM Corp. has tried to push its mainframe line as a platform for business intelligence (BI) and analytics. This effort got off to a muted start six years ago last month, when Big Blue introduced its System z Integrated Information Processor (zIIP). The zIIP was designed to give mainframe shops a less expensive way to run database- or data-processing-specific workloads on System z: eligible workloads run on the zIIP rather than on the general purpose (GP) mainframe processors, so they don't consume the GP capacity on which mainframe software charges are typically based.

Over the last five years or so, Big Blue's become more ambitious about mainframe-centric BI. Remember those famous V8 commercials in which a man drinking an unhealthy beverage slaps his forehead after realizing he could have (and probably should have) opted for a healthier, better alternative: V8 juice?

This is similar to Big Blue's pitch with BI-on-System z: If you already have a mainframe in-house, and if you're continuing to invest in your mainframe assets, why not make Big Iron the centerpiece of your BI and data warehousing (DW) efforts? Or: you, too, could've had a z.

Getting Serious, Getting Clever

Big Blue's bring-BI-back-to-the-mainframe push, like its broader (and more ambitious) Smart Analytics initiative, took on fresh urgency with its acquisitions of the former Cognos Inc. and SPSS Inc. Since then, IBM has tried to drum up support for Cognos BI running on System z, or zEnterprise, as it now calls its mainframe line. IBM likewise introduced a dedicated Big Iron-based Smart Analytics offering, its Smart Analytics Optimizer (SAO) for DB2 for z/OS.

Big Blue's acquisition of Netezza Inc. bore Big Iron fruit late last year, when IBM introduced its DB2 Analytics Accelerator (DAA) for z/OS version 2.1. The DAA for z/OS 2.1 is based on the massively parallel processing (MPP) analytic database technology IBM acquired from Netezza. Along with the DAA, Big Blue announced a pair of new Smart Analytics offerings for z/OS: its Smart Analytics Systems 9700 and 9710. It positions both offerings as centerpieces of its vision of an "end-to-end" DW and BI infrastructure based on zEnterprise. To that end, the 9700 and 9710 are designed to function as data distribution hubs and as platforms for transactional, operational, or enterprise reporting.

There's another wrinkle. In tandem with its zEnterprise mainframe launch in July of 2010, IBM introduced a new mainframe-open systems hybrid: the zEnterprise BladeCenter Extension (zBX). zBX, for folks who don't speak zEnterprise-ese, is a technology "sidecar" that -- when used in conjunction with Big Blue's Unified Resource Manager (URM) -- permits BladeCenter platforms, along with zEnterprise mainframes, to be managed as a single virtualized system.

In other words, shops can manage Windows workloads -- including Windows-based BI and DW workloads -- along with their mainframe, Unix, and Linux workloads. The wrinkle here is that to the extent a shop is willing to pony up the cash for a zBX, any Windows workloads running in the zBX will essentially become the property of the mainframe group. (Linux workloads running in a zBX, or hosted by IBM's low-cost Integrated Facility for Linux, which runs on the mainframe itself, likewise become the property of the mainframe group.)

The message, in a sense, is clear: you can centralize all of your workloads on or near the mainframe, and you can manage them all from the mainframe -- so why not shift your BI and DW workloads to (or back to) Big Iron?

MPP, BI, and Predictive Analytics on the Mainframe, Too

IBM has become more vocal about the BI aspect of this message over the last six months. For starters, the analytic systems that IBM announced last October help to flesh out its vision of a mainframe-centric BI and DW infrastructure.

Like Oracle Corp., which now markets Exadata as a one-stop shop for analytic and OLTP workloads, IBM is pushing System z as an all-things-to-all-business-consumers offering. As, you know, the original one-stop shop.

Last month, mainframe watcher Alan Radding, who doesn't typically cover BI or DW, published an article in which he considered the mainframe as a platform for predictive analytics. He began by conceding the obvious.

"Over the last several decades people would laugh if you suggested a mainframe for data analysis beyond the standard canned system reporting," wrote Radding, on his Dancing Dinosaur blog. "For ad-hoc [sic] querying, multi-dimensional analysis, and data visualization you needed distributed systems running a variety of specialized GUI tools. In addition, you'd want a small army of business analysts, PhDs, and various quants to handle the heavy lifting."

Putting aside the requisite "small army," Radding nonetheless pronounced the zEnterprise an "ideal vehicle" for analytic workloads.

"It is architected for on-demand processing through pre-installed capacity paid for only when activated and while adding processors, disk, and memory without taking the system offline," he argued. Turning from an assessment of zEnterprise's innate attributes, Radding cited IBM's Netezza-based DB2 Analytics Accelerator for zEnterprise as exemplary of the way in which the mainframe can enrich and accelerate BI and DW.

"When embedded in the zEnterprise, it delivers the same kind of performance for mixed workloads -- operational transaction systems, data warehouse, operational data stores, and consolidated data marts -- but with the z's extremely high availability, security, and recoverability," he argued. "As a natural extension of the zEnterprise, where the data already resides in DB2 and OLTP systems, the z is able to deliver pervasive analytics across the organization while further speeding performance and ease of deployment and administration."

Maybe. That's certainly IBM's message, anyway, and Big Blue isn't just talking, either: as the DAA for z/OS, its Smart Analytics Systems 9700 and 9710, and its efforts around Cognos BI on zEnterprise demonstrate, it's equipping shops to move or return BI and DW applications to Big Iron.

IBM is also evangelizing more about zEnterprise and BI: as Dancing Dinosaur's Radding points out, Big Blue has appointed a dedicated point person (Alan Meyer, formerly of its InfoSphere marketing team) to manage its marketing efforts for data warehousing on System z. And at last October's Information on Demand (IOD) conference in Las Vegas, IBM offered no fewer than 13 tracks on Big Iron BI, including a session comparing DB2 for z/OS and Oracle Exadata as platforms for DW.

The catch, of course, is that BI and data warehousing professionals tend to be cool -- if not downright hostile -- to the idea of the mainframe as a platform. Most are used to thinking of the mainframe as a source or provider platform: as something out of which data or information has to be pried or cajoled, usually by hand-coding. Few are used to thinking of Big Iron as a partner platform, and fewer still -- very, very few, in all likelihood -- are comfortable with the idea of the mainframe at the center of an organization's BI and DW practice.

"I don't see a lot of people doing the work on the [mainframe], but there are a few. My personal view is that people still running mainframes are doing it because the cost [e.g., software, labor, operations] to migrate is not worth it, or because they're Luddites who don't want to learn anything new," said a prominent analyst and former DW professional, in a discussion about mainframe DI a couple of years ago. "I find it hard to justify expanding a platform for which talent is increasingly rare, [the cost of] which is so high, and which is really inflexible and hard to integrate into today's technology landscape."

Questioned last year about mainframe data warehousing, this DW observer was as pessimistic as ever, citing -- for example -- the persistence of hand-coded extract, transform, and load (ETL) processing in mainframe environments. In many cases, this person suggested, folks are still using flat files and FTP transfers (scripted or coded as batch jobs) to get data off their mainframes. Why? Because it costs more to do ETL processing directly on Big Iron, even with IBM's attempts to slash pricing.
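For readers who haven't watched this pattern up close, here's a minimal sketch of the kind of hand-coded extract the observer describes: a script that pulls a sequential flat-file dataset off z/OS over FTP for downstream loading. The host, credentials, and dataset name are hypothetical placeholders; the sketch uses only Python's standard ftplib.

```python
# Minimal sketch of the flat-file-plus-FTP extract pattern described above.
# Host, credentials, and dataset name are hypothetical placeholders.
from ftplib import FTP

HOST = "mainframe.example.com"        # hypothetical z/OS host
USER, PASSWORD = "etluser", "secret"  # placeholder credentials
DATASET = "'PROD.SALES.EXTRACT'"      # hypothetical dataset; single quotes
                                      # mark it as a fully qualified name

ftp = FTP(HOST)
ftp.login(user=USER, passwd=PASSWORD)

# retrlines() transfers in ASCII mode, so the mainframe's FTP server
# translates EBCDIC records to text and supplies line breaks.
with open("sales_extract.txt", "w") as out:
    ftp.retrlines("RETR " + DATASET, lambda line: out.write(line + "\n"))

ftp.quit()
```

In practice the same job is just as often written as JCL driving the batch FTP client, but the shape is identical: dump to a flat file, transfer, and load -- with none of the metadata, lineage, or error handling a DI tool would provide.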

Even if IBM slashes the cost of mainframe capacity and lowers the cost of its own software, its ISV competitors -- a group that includes several prominent DI and BI players -- have little or no incentive to follow suit. Many, in fact, price their mainframe products -- e.g., BI or DI applications that run directly on z/OS, connectors or adapters that permit shops to move data to or from mainframe systems -- at a premium that reflects both the traditional pricing of the Big Iron platform and the implied criticality of mainframe applications.

That's one reason why so many shops are still using hand-coding to siphon data in and out of their mainframe environments. "It's the nature of the beast," the data warehouse observer concluded. "Even with mainframe DB2, I still see it."
