In-Depth
Information Foraging and the Changing Face of BI
Even if users are getting poor results out of their existing business intelligence (BI) tool, why might they prefer to stay put, and how does the next wave of BI interfaces propose to change things?
- By Stephen Swoyer
- 06/15/2011
If you want to understand business intelligence (BI) usage patterns, says industry veteran Donald Farmer, QlikView product advocate with QlikTech Inc., you'd do well to look at the food-foraging behaviors of monkeys.
Farmer isn't conflating BI consumers with monkeys, although some more cynical folk haven't shied away from doing just that. Instead he's referring to "information foraging," a concept that was first developed by researchers Peter Pirolli and Stuart Card of the Palo Alto Research Center (PARC) Inc. in the late 1990s.
Pirolli and Card's work is based on the idea of "optimal foraging theory," an ecological concept that describes how organisms organize and maximize their foraging -- or food-seeking -- activities.
Information foraging theory, as developed by Pirolli and Card, describes how humans organize and maximize their information-seeking activities.
That's where monkeys enter the picture, says Farmer. "I have a monkey in a tree. It's eating fruit. At some point, the tree starts to yield less fruit, or the fruit isn't as easy to get to -- it's far out on the limbs, it's higher up in the tree. The monkey has to do more and more work to get the fruit out of that tree. At what point does the monkey decide to quit that one tree for another tree?"
The point, Farmer says, is that users, like monkeys, tend to stick with what they know -- up to a certain point. If there's abundant and low-hanging fruit in an adjacent tree, and if (moreover) it's relatively easy to get at, the monkey in Farmer's example will be more inclined to make the switch. The same isn't necessarily true of BI consumers, however: if BI users are getting good results from current tools, they'll be less amenable to making a switch.
Actually, BI users don't even have to be getting good results, Farmer says: in this kind of context, predictably mediocre results are still predictably mediocre results; making the switch, on the other hand, is fraught with risk and uncertainty.
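Optimal foraging theory does offer a formal answer to Farmer's question, in the form of Charnov's marginal value theorem: a forager should abandon its current patch once the return it is getting there falls below what it could expect elsewhere, net of the cost of moving. The Python sketch below is purely illustrative (the function name, the yield figures, and the uncertainty discount are assumptions introduced here, not anything from Farmer or from Pirolli and Card), but it captures the stay-or-switch calculation that the monkey, and by extension the BI user, is implicitly making.

```python
# Illustrative sketch of a patch-leaving ("stay or switch") decision,
# loosely based on the marginal value theorem from optimal foraging theory.
# All names and numbers here are hypothetical, chosen only to show the shape
# of the trade-off described in the article.

def should_switch(current_yield, expected_yield_elsewhere, switching_cost,
                  uncertainty_discount=0.8):
    """Return True if the expected payoff of moving beats staying put.

    current_yield            -- value gained per unit effort from the current "tree" (or BI tool)
    expected_yield_elsewhere -- estimated value per unit effort from the alternative
    switching_cost           -- cost of learning or migrating, spread per unit effort
    uncertainty_discount     -- haircut applied to the alternative because its results are unproven
    """
    discounted_alternative = expected_yield_elsewhere * uncertainty_discount - switching_cost
    return discounted_alternative > current_yield

# A tool that still delivers predictably mediocre results keeps its users:
print(should_switch(current_yield=5.0, expected_yield_elsewhere=7.0, switching_cost=2.0))   # False
# Only when the adjacent tree is clearly richer and easy to reach does switching win:
print(should_switch(current_yield=5.0, expected_yield_elsewhere=12.0, switching_cost=2.0))  # True
```

The asymmetry is the point: because the alternative's results are unproven, they get discounted, which is exactly why predictably mediocre results can still be enough to keep users where they are.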
First, BI users have to master the usage paradigm of another tool. That's a scary thought for many. Second, they don't even know what kinds of results to expect. Sure, the new tool could yield better insights or results than the existing one does -- but how much better will they be?
On the other hand, couldn't they be worse? Haven't they been promised an easier user experience -- to say nothing of better results -- at different times in the past?
Yes and yes, but the past is the past -- and (more importantly) the paradigms of the past are giving way to new paradigms. The BI results of today aren't packaged, presented, or consumed the same way they were a decade ago. Interactive has replaced static. Mobile devices are complementing desktops. Information is getting closer to real time. Consumers would like to interact and collaborate in ways that just weren't imaginable 10 years ago.
"It's more of a scatter-gather [paradigm]," says Farmer. "You start from one place, you spread out, and then you come back to that place and regroup and go out again." People have been talking about the Google-fication of search for a decade now, he continues, but what's perhaps most interesting about Google isn't the idea of search technology but that of the Web browsing experience, particularly in the context of search. "Think of what happens when you do a search in Google. You click on a link and you start searching [i.e., foraging] for information. At what point do you give up on that link and find another one?"
It's a different paradigm, Farmer insists, and it requires a different approach.
Ask him about "BI," for example, and he'll shift the discussion to something QlikTech calls "Business Discovery." Farmer isn't just quibbling over semantics, either. The difference, he insists, is experience.
"Our exploratory experience in QlikView actually works the same [as the Google search experience]. People have this sort of associative view, so what we allow them to do is to go back and forward between the different queries that they've issued, or ... we have a current selection view which shows them the state of all of the selections that they've made," he explains.
This is a good start. It's an approach other BI players are pursuing, too. Farmer says social networking -- which adds a richly collaborative dimension to the vanilla Web browsing experience -- might provide the best context in which to forage for information. "We've taken this idea of information foraging and applied that to business discovery, and that's worked really well. Now let's apply a similar ... approach [involving what Farmer calls "deep interaction"] to how ... we communicate these discoveries and decisions to other users."
In this case, he says, the idea is to promote meaningful business collaboration -- with an inescapably social dimension -- between and among users. "Look at things like the metadata tagging, or the ability to comment on the documents that people have. This is [one case] where [social networking] makes sense."
Some vendors, such as LyzaSoft, were far out in front of the social networking wave; LyzaSoft's Lyza Commons, released 16 months ago, achieves a collaborative experience not unlike that of Facebook.
Others, including QlikTech and IBM Corp. (which introduced a social collaboration component with its Cognos 10 release late last year), are easing themselves -- and their customers -- into BI-oriented social networking.
"Social networking grows gradually," Farmer concludes, conceding that LyzaSoft, which was starting almost from scratch, delivers a "neat" product. Vendors that have large installed bases have to approach things more iteratively, he argues.
"If you try and do it all at once, people aren't going to pick up on it. It might even turn people off. Google Wave was a do-everything social networking tool. That [didn't do so well]. Social networking doesn't grow that way."