In-Depth
Accelerating BI Adoption in Smaller Enterprises
Business intelligence in small and midsize enterprise shops is an anecdotally tough nut to crack. A new report claims to offer up a nutcracker of sorts.
- By Stephen Swoyer
- 07/26/2011
Business intelligence (BI) in small and midsize enterprises (SMEs) is an anecdotally tough nut to crack. A new Checklist report from TDWI aims to help start cracking it.
The lament is familiar: SMEs just don’t have the budget, the resources, or (a function of both) the expertise of their larger enterprise counterparts. Nevertheless, BI in SMEs has a long history.
Even before the emergence of software-as-a-service (SaaS), open source software, and other unconventional software licensing or delivery models, BI vendors frequently targeted SME or mid-market customers, arguing that SMEs were poorly served by existing offerings.
A common concern, for large enterprises and SMEs alike, is that one doesn’t just do business intelligence: one doesn’t buy or license BI software, deploy it to one’s users, and simply start using it. According to David Loshin, president of BI consultancy Knowledge Integrity Inc., and author of TDWI’s “Checklist Report: Fundamentals of Business Intelligence for the Small and Midsize Enterprise,” successful BI efforts start with “corporate introspection.”
That’s the first item in Loshin’s checklist. “Introspection” in this case not only involves reviewing key performance objectives -- to say nothing of how these objectives are to be achieved -- but identifying key business processes that must be improved. “Improvements are demonstrated through measures and metrics, and therefore it is necessary to define performance measures, success objectives, and information requirements for each identified business process. Put a framework in place to conduct a baseline measurement and continuous monitoring of these key performance indicators,” Loshin writes.
“Once business processes are identified, you need to know the key players, the roles they play, and how additional knowledge provided through a BI framework can benefit the organization,” he continues. “These stakeholders, business users, and customers become the primary consumers of BI, and their needs must be solicited to create your initial set of functional and information requirements for reporting, analysis, delivery, and presentation.”
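Loshin’s report doesn’t prescribe tooling, but the “baseline measurement and continuous monitoring” he describes can start small. The following sketch, in Python, is purely illustrative: the metric name, baseline figure, target, and alert threshold are hypothetical, invented here to show the shape of such a framework.

```python
# Minimal sketch of baseline measurement and continuous monitoring of KPIs.
# Metric names, baseline figures, and thresholds are hypothetical examples.
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    baseline: float         # value measured before the BI effort began
    target: float           # success objective agreed on with stakeholders
    alert_threshold: float  # tolerated fractional slip below baseline

    def evaluate(self, current: float) -> str:
        """Compare a fresh measurement against the baseline and target."""
        if current >= self.target:
            return f"{self.name}: {current:.2f} -- target met"
        if current < self.baseline * (1 - self.alert_threshold):
            return f"{self.name}: {current:.2f} -- below baseline, investigate"
        return f"{self.name}: {current:.2f} -- between baseline and target"

# Example: an order-fulfillment process measured as orders shipped per day.
kpis = [KPI("orders_shipped_per_day", baseline=420.0, target=500.0, alert_threshold=0.05)]
for kpi in kpis:
    print(kpi.evaluate(455.0))
```

The point isn’t the code; it’s that each identified business process gets an explicit measure, a baseline, and a success objective that can be monitored over time.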
Introspection is the first step. Even if it’s an obvious one, that doesn’t make it any easier. Take it from one veteran of the mid-market scene.
“Our model is [that we go into a customer account and] spend one day talking with the business [users] and identifying their needs, their requirements, installing [our software] and getting it working and spend the next four days proving it to them,” says David Macey, CEO of SwiftKnowledge Inc., a BI start-up that innovates on top of Microsoft Corp.’s BI stack. Macey says that SwiftKnowledge’s target industry -- mid-market banking or financial institutions -- is a relatively greenfield segment as far as BI is concerned.
One upshot of this is that SwiftKnowledge’s mid-market customers don’t always approach BI deployments the same way their large enterprise counterparts might. “What we find a lot [of times] is that they [i.e., business users] don’t have a real clear understanding of their requirements ... of how they relate to their [company’s] objectives. It’s ‘I would like to be able to do this,’ or ‘I need this,’ or ‘Why can’t I have this?’” he explains. “So we tell them, ‘Everyone can have their own dashboard.’ We say, ‘Think of the other dashboards [that are exposed via a user’s dashboard view] as kind of the corporate dashboard. You [the rank-and-file user] can have the stuff that matters to you in your own [dashboard], too.’”
The second item on Loshin’s list is to build a data warehouse-based infrastructure for reporting and analysis. Although it’s certainly possible to use a regular production database, he concedes, it isn’t advisable.
“An organization’s existing transaction environment may be able to tolerate a few ad hoc report requests, but as the appetite for actionable knowledge increases, there is a risk that more ad hoc queries will slow down or interfere with existing production systems,” Loshin writes, citing data accessibility, data usability, and data quality issues.
“[T]here is a good approach to addressing these issues: provide a separate data and processing environment,” he continues, adding that “unlike transactional or operational systems, data warehouses are engineered to support data cleansing, consolidation, various data architectures for aggregating and summarizing results, and BI applications for reporting and analysis.”
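To make the separate-environment point concrete, here is a minimal, purely illustrative sketch of the pattern Loshin describes: summarize rows out of the transactional store and load them into a separate reporting store, so ad hoc queries never compete with production. SQLite stands in for both systems, and the table and column names are hypothetical.

```python
# Sketch: periodically copy pre-aggregated operational data into a separate
# reporting store so ad hoc analysis never competes with production workloads.
# Table and column names are hypothetical; SQLite stands in for both systems.
import sqlite3

prod = sqlite3.connect("operational.db")     # transactional / production system
warehouse = sqlite3.connect("warehouse.db")  # separate reporting environment

prod.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, region TEXT, amount REAL)")
prod.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "east", 120.0), (2, "west", 75.5), (3, "east", 210.0)])

warehouse.execute("""CREATE TABLE IF NOT EXISTS sales_by_region
                     (region TEXT PRIMARY KEY, total_amount REAL, order_count INTEGER)""")

# Extract an aggregated summary from production in a single pass...
rows = prod.execute(
    "SELECT region, SUM(amount), COUNT(*) FROM orders GROUP BY region").fetchall()

# ...and load it into the warehouse, where reporting queries run instead.
warehouse.executemany(
    "INSERT OR REPLACE INTO sales_by_region VALUES (?, ?, ?)", rows)
warehouse.commit()

print(warehouse.execute("SELECT * FROM sales_by_region ORDER BY region").fetchall())
```

A real deployment would add cleansing, consolidation, and history, but the separation itself is the first structural decision.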
Third, Loshin counsels, shops should standardize how they represent and integrate data. This, too, could pose something of a learning curve, if only because data integration -- or, more precisely, the kind of data integration (DI) that’s associated with the big information integration-class platforms marketed by IBM Corp. and Informatica Corp. -- is typically viewed as an esoteric and costly proposition. It’s a crucial step, however.
“Standardizing the definitions, data element representations, table structures, and overall data model used for the core business information concepts (such as customer or product) captured within the data warehouse will reduce the negative impacts of variation and inspire increased trust in the resulting reports and analyses,” Loshin explains.
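At its simplest, that kind of standardization can be expressed as a mapping from each source system’s field names and value formats onto one agreed-upon representation of a core concept such as “customer.” The sketch below is hypothetical -- the sources, fields, and formats are invented for illustration, not taken from Loshin’s report.

```python
# Sketch: normalize differently shaped source records onto one agreed-upon
# "customer" representation before loading. Sources and fields are hypothetical.
STANDARD_FIELDS = ("customer_id", "full_name", "country_code")

def standardize_customer(record: dict, source: str) -> dict:
    """Map a source-specific record onto the standard customer model."""
    if source == "crm":
        return {
            "customer_id": str(record["CustID"]).strip(),
            "full_name": record["Name"].title().strip(),
            "country_code": record["Country"].upper()[:2],
        }
    if source == "billing":
        return {
            "customer_id": str(record["account_no"]).strip(),
            "full_name": f'{record["first"].title()} {record["last"].title()}',
            "country_code": record["ctry"].upper()[:2],
        }
    raise ValueError(f"unknown source: {source}")

# Two sources, one consistent shape afterwards.
print(standardize_customer({"CustID": " 1001 ", "Name": "acme corp", "Country": "us"}, "crm"))
print(standardize_customer({"account_no": 1001, "first": "jane", "last": "doe", "ctry": "usa"}, "billing"))
```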
Besides, integration doesn’t have to be costly or -- depending on one’s scope and needs -- esoteric. Informatica, for example, markets a platform-as-a-service (PaaS) DI offering; Microsoft and Oracle Corp. bundle DI tools with their database offerings; Talend offers a free community license version of its OSS DI tool; and upstarts like Expressor Software Inc. and WhereScape Inc. come at DI from a nuts-and-bolts perspective.
“Complexity should not be an issue,” said Michael Whitehead, CEO of WhereScape Inc., during an interview at this year’s TDWI World Conference in Las Vegas. For the same reason, Whitehead argued, you “shouldn’t have to pay hundreds of dollars an hour” for a consultant with expertise in the use of DI platform tools, either.
Thanks to another trend, Loshin’s fourth checklist item -- i.e., establishing a plan for ensuring quality data -- doesn’t have to be prohibitively difficult, either.
DataFlux, IBM, Informatica, and SAP AG -- among other players -- market best-of-breed data quality (DQ) technology, but a host of other vendors -- including Information Builders Inc. (IBI), Microsoft Corp., Oracle Corp., and OSS DI player Talend -- offer data quality capabilities as part of larger offerings.
“The mid-market is a big growth area for us,” said DataFlux CEO Tony Fisher, interviewed by BI This Week during TDWI’s World Conference in February.
DataFlux spent most of 2010 ramping up its services arm to address the DQ needs of its customers. Part of this effort involved an assessment of the DQ needs of mid-market customers. “We think that [i.e., the mid-market] is a good market sector for data quality. We also think the mid-market for MDM [master data management] is kind of underserved. The price point or the entry point -- from a cost-of-services and the human expertise perspective -- has traditionally been viewed as too high. We’re trying to reduce that down so that we can get practical MDM,” Fisher said.
PaaS or cloud-oriented DQ services are still incubating -- Informatica, for example, doesn’t yet market a discrete DQ PaaS offering -- but they’re coming, Fisher says. “There are still technology issues that have to be sorted out, and [DataFlux is] just putting the core architecture in place, assuming that once [the industry] figures these out, [customers] will come.”
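Whatever tooling a shop eventually settles on, a plan for ensuring quality data tends to start with a few unglamorous checks: completeness, validity, and duplicates. The sketch below is a hypothetical illustration of such baseline checks, not a stand-in for a dedicated DQ product; the rules and field names are invented.

```python
# Sketch of baseline data quality checks: completeness, validity, duplicates.
# The rules and field names are hypothetical examples.
import re

def profile_customers(rows: list[dict]) -> dict:
    """Return simple data quality counts for a batch of customer records."""
    issues = {"missing_email": 0, "bad_email_format": 0, "duplicate_ids": 0}
    seen_ids = set()
    email_pattern = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    for row in rows:
        email = (row.get("email") or "").strip()
        if not email:
            issues["missing_email"] += 1
        elif not email_pattern.match(email):
            issues["bad_email_format"] += 1
        if row["customer_id"] in seen_ids:
            issues["duplicate_ids"] += 1
        seen_ids.add(row["customer_id"])
    return issues

sample = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": ""},
    {"customer_id": 2, "email": "not-an-email"},
]
print(profile_customers(sample))
```

Monitoring counts like these over time is one simple way to turn a data quality “plan” into something measurable.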
To read the rest of the recommendations in Loshin’s report, download TDWI’s “Checklist Report: Fundamentals of Business Intelligence for the Small and Midsize Enterprise” here (short registration required for first-time users).