
BI's Future: Learning How People Think

Former TDWI education director Dave Wells has peered into the future of BI, and he sees actual intelligence, not just technology.

Earlier this year, Dave Wells ended a five-and-a-half-year stint as TDWI education director, a position he took after a career in technology and business that he began in 1970. He says that at TDWI he felt bound by his responsibility to keep the curriculum practical for those who relied on it for application back at work. Now he's free to think and dream. I talked to him by phone early last month.

Cuzzillo: You've said that as TDWI education director you felt a responsibility to those who needed practical knowledge. Now that you're free of that duty, what new insight have you come up with?

Wells: I guess one of the things that comes to mind is the realization that BI is more about people than it is about technology. For BI to go to the next level, we really need to understand how people think.

You are aware that I have done a little bit of work around this systems-thinking approach, thinking in terms of cause and effect. There are two other communities out there that are exploring how people think. One is called design thinking, which focuses on the structure of things and the relationships between things. Then there's this critical-thinking community, where the essence of critical thinking is challenge everything, test everything (especially your own assumptions and beliefs).

Give me an idea of how this critical-thinking stream will apply here.

One of the real looming questions, I think, as BI becomes mainstream is: what do you do when analytics contradict conventional wisdom? Then it becomes a political and religious debate. If we proactively streamed critical thinking into business intelligence, we would do it in such a way that we raise the question, "What are the fundamental beliefs that drive our behaviors in a company?"

Every company has these gut-level beliefs that direct our behaviors. If we can surface what they are, and use BI to challenge them, to test them, to prove them or disprove them, and become proactive about testing our assumptions, [we can avoid] the reactive, "Oh, no! The data must be wrong, Ted!"

That's the kind of thing you'd think would have been done automatically.

I don't think it's happening anywhere. I think we have two communities in any company: those who are running around playing with the numbers, and those who are running on gut instinct. When they collide, fascinating stuff is going to happen.

This isn't applicable tomorrow, but there is life after BI. Someday, BI is going to be subsumed by what is currently known as knowledge management, but it will have acquired another name by the time it has become mainstream. It's on the horizon now, as we see people playing more [with] unstructured data, spatial data, and text analytics.

Do you think BI, the term, is already fading?

Yes. I think that it has to be, and that's the natural cycle of things. I think there is something that will encompass both BI and knowledge management and somebody will come up with a clever label for it. As these two things converge, which they must do, they will find a new label, and BI will go under the hood of this emerging thing, just as data warehousing went under the hood of BI.

You talked about the critical-thinking stream. What about design thinking?

Design thinking is about understanding the components of things and how they relate to one another. It's the one I'm the least well-versed in right now, the one I've just started doing the research on. Design thinking is, in some respects, almost aligned with Eastern philosophy. What is the Chinese term for the placement of things?

Feng shui?

Yes, well, that is a design-thinking kind of belief. What we have traditionally done when we analyze something is decompose it into its parts and study those parts. The flaw in our current approach to analyzing things is best described by an analogy: "You cannot understand the impact of Van Gogh's Starry Night by categorizing the brush strokes."

That's where the design thinkers are trying to go. It's "Yes, it is about the parts, but it's about the relationships and the placement of those parts in relation to one another." Maybe the value of real high-impact metrics, for instance, is not in the measuring of things but in the measuring of relationships.

How will the BI world find its way there?

It is inevitable because it is a natural product of evolution. Things that we do and think of as independent tend to converge or form new things. This is the natural evolution of human intelligence, society, commerce, and many other things. When I was a child, 11 or 12 years old, my older sister was married to a man who worked his fields with draft horses. The world that he entered and the world that he left were radically different things. The world that I entered and the world that I leave will be radically different, and the same is true for my children, and so on.

I am trying to understand some piece of intelligence here. In fact, I challenge all of the conventional definitions of business intelligence. I beat up some big names as not having a clue what real intelligence is. They talk about tools and technology and systems and data, and that's not intelligence. That stuff may enable intelligence in a communal sense, but intelligence is reasoning and discriminating between concepts, understanding, drawing conclusions, planning, predicting, abstracting patterns and similarities from apparently different things, understanding how they relate, and applying that abstraction in another circumstance. That's intelligence.

What have some of these big names said to you when you beat them up?

Probably some of the things that I say to myself: "eight miles high," "out in left field," "this is stuff that has no practical application today." But today is no longer interesting; we've got today figured out. You're not going to make a difference by repeating today over and over for the next hundred days.

Would they say, "What about all this dirty data? We still have something technological to fix"?

There will always be something technological to fix. Let's go back to Starry Night. Do I need to fix that brush stroke or do I need to stand back and look at the big picture? There are an awful lot of times where I could go from 96 percent data quality to 98 percent data quality and not make a niggling bit of difference in the amount of insight that you can gain from the data.

So you think the state of most data is probably good enough the way it is?

Tweaking the details probably doesn't make any significant contribution to the overall value of the insights I can gain by understanding the patterns inherent in the data. Maybe I gain more insight by recognizing that the data is flawed, accurate to plus or minus 4 percent. So, given that accuracy, does it support what I believe instinctively? Does it contradict what I believe? What does it suggest for the future? What can I learn about the whys of the past? This is one of the problems we have in retrospective analysis: we spend a lot of energy looking at what happened in the past and very little energy understanding why it happened.

That's a little bit like all my history classes in high school, which were about memorizing dates and places -- and that lost all the good stories about history. I go back and read history now, and if I read the right stuff, there are some wonderfully rich stories there that were never told in history classes. Where's the value in being able to quote all the dates? The valuable learning occurs in understanding what happened and why, getting those really good stories, and then relating them to what is going on today, for history really does repeat itself.

Probably so. OK, suppose we say the idea has practical applications. We're only a couple of guys. What are we going to do?

Anything that we do that is going to make a real difference, we don't do all at once. We take one stream of this thing, and we actually try to apply it -- maybe not across an entire BI program but in some program, and see how it works. Drive it toward the communal learning.

Will it all converge into one? Certainly not. But if, say, during that same 10-year time span there are people as engaged in systems thinking as they are in [Six Sigma] today, and if during that same span we can get to the point where there are people as engaged in critical thinking as they are today in ISO 9000, then we will have climbed rungs on the evolutionary ladder.

So, what do we do today? We start putting these things into practice, learning what works and what doesn't, and getting to the point where we can talk about best practices.

What about BI vendors? What role do you see for them in driving this forward?

I think the companies that are out there building systems-thinking products and the ones that are building BI products haven't connected with one another yet. Maybe that's an area I should explore. Maybe I should get out there and get some of these vendors hooked up. That's a good thought, [and] an interesting idea.

There would be a new kind of consolidation there, wouldn't there?

Yes, that could be fun.
