In-Depth
Simulation and Modeling: Beyond Fear and Loathing
Fifteen to twenty years ago, simulation was over-hyped. Today it's time to investigate the wide range of cost-effective, working solutions for business design needs.
For all their undeniable power, information systems are a double-edged sword for managers. While they greatly enhance performance and control of all sorts of organizational activities, they also radically ratchet up the level of unpredictability and risk. As the detail and dynamic complexity of systems grow, so does the necessity of shifting from implicit to explicit information modeling.
In theory, this transition should be a no-brainer. Discrete event simulation is a powerful tool that can generate value in many contexts. Formal, scalable models of information systems can cope with huge amounts of data and are easily shared by different work groups. Plus, simulations encourage learning throughout every stage of the IS life cycle.
In practice, though, for many companies the transformation is fraught with fear, if not outright loathing.
Part of the explanation is that simulation was badly over-hyped 15 to 20 years ago, when the technology wasn't ready for prime time. Another reason is that many business decision makers are instinctively suspicious of black boxes, often with good cause. As the meltdown of Long-Term Capital Management demonstrated, you don't want to bet the house on simulations you don't understand.
While procrastination may have been understandable a few years ago, the good news is that there is now a wide range of cost-effective solutions for all sorts of business design needs, and they are ready for prime time. Everyone in the IS world is familiar with computer-aided software engineering (CASE) tools built around the Unified Modeling Language (UML), but these are hardly the only models worth looking at.
Prominent among new solutions are early-stage tools designed to elicit and organize requirements. These tools focus on capturing information about the business process the IS application will support. Using a drag-and-drop interface with various icons, the tools document the timing, sequence, duration, and performers of activities, along with their inputs and outputs. Costs are associated with activities, inputs, outputs, and performers. The more rigorous process simulators insist on a specific syntax for describing a process to ensure that the simulation results will make sense. Once a model is reasonably complete, the process simulator can produce data about activity cost, throughput, bottlenecks, and logic flaws.
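None of these products exposes its engine this way, of course, but the bookkeeping at the heart of a discrete event process simulator is simple enough to sketch. The Python fragment below is a minimal, hypothetical example (the activity names, durations, and arrival rate are all invented for illustration): it pushes work items through a two-step approval process and reports the throughput, cycle-time, and utilization figures such tools derive, with the busiest activity flagged as the likely bottleneck.

```python
import random

random.seed(42)  # make the run reproducible

N = 1000                        # work items pushed through the process
MEAN_GAP = 5.0                  # minutes between arrivals (invented)
MEAN_SERVICE = {"review": 4.0,  # mean duration of each activity (invented)
                "approve": 4.8}

done = {a: 0.0 for a in MEAN_SERVICE}   # when each performer next frees up
busy = {a: 0.0 for a in MEAN_SERVICE}   # total working time per performer
cycle_times = []
t = 0.0

for _ in range(N):
    t += random.expovariate(1.0 / MEAN_GAP)    # next item arrives
    finish = t
    for activity in ("review", "approve"):     # activities run in sequence
        service = random.expovariate(1.0 / MEAN_SERVICE[activity])
        # an activity starts when both the item and its performer are free
        finish = max(finish, done[activity]) + service
        done[activity] = finish
        busy[activity] += service
    cycle_times.append(finish - t)             # arrival to completion

makespan = done["approve"]
print(f"throughput: {N / makespan:.2f} items/minute")
print(f"average cycle time: {sum(cycle_times) / N:.1f} minutes")
for activity, b in busy.items():
    print(f"{activity} utilization: {b / makespan:.0%}")  # busiest = bottleneck
```

Commercial tools layer cost roll-ups, branching logic, and resource pools on top of exactly this kind of event arithmetic.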
One product in this category, Holosofx, has become prominent, largely due to its recent acquisition by IBM. The Holosofx Workbench includes support for XML and exports model data to UML modelers such as Rational Rose. Telelogic takes a very different approach to process simulation in its Tau SDL Suite: employing a draft version of UML 2.0 and some of its own extensions, it models a set of activities and simulates them in generated C or C++ code. For the real-time processes in which Telelogic specializes, formal modeling and simulation are a must because of the stringent requirements for managing queues and avoiding bottlenecks.
Also worth investigating is a new generation of network and infrastructure modeling tools with their own capabilities for tracking and reporting operating performance. For complex, large-scale projects, particularly those with demanding service levels or performance-based contracts, these tools are becoming indispensable. They help prepare models of message flows across networks and through software layers, and are especially useful for hardware capacity planning and budgeting or for system performance optimization. The simulator pushes the expected workload of the application through the network and infrastructure to spotlight under- and over-utilized components such as hardware servers, database engines, and wide area networks.
HyPerformix of Austin, Texas, for example, offers the Integrated Performance Suite. It includes performance analysis, optimization, and pre-built models of common infrastructure software such as IBM’s WebSphere, SAP, and Oracle. One large project to overhaul the core applications of a Fortune 50 financial services company uses HyPerformix models to predict performance of hardware and software infrastructure under different workload scenarios.
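HyPerformix's models are far richer than anything sketched by hand, but the arithmetic underneath capacity planning is worth seeing once. The fragment below applies the standard utilization law (utilization equals transaction rate times service demand, divided by the number of units serving) to a hypothetical four-tier infrastructure. The tier names, per-transaction demands, and workload scenarios are all invented, and figures above 100 percent simply mean a tier cannot keep up.

```python
# Hypothetical service demands: seconds of each resource consumed per transaction.
DEMAND = {"web tier": 0.010, "app tier": 0.030, "database": 0.020, "WAN link": 0.005}
UNITS  = {"web tier": 2,     "app tier": 4,     "database": 2,     "WAN link": 1}

def utilization(tps):
    """Per-tier utilization at a given load (utilization law: U = X * D / units)."""
    return {tier: tps * DEMAND[tier] / UNITS[tier] for tier in DEMAND}

# Push each expected workload through the model and flag trouble spots.
for scenario, tps in [("today", 40), ("peak season", 90), ("post-rollout", 150)]:
    print(scenario)
    for tier, u in utilization(tps).items():
        status = "over-utilized" if u > 0.85 else "under-utilized" if u < 0.30 else "ok"
        print(f"  {tier:<8} {u:7.0%}  {status}")
```

Rerunning the same model against each new workload scenario is exactly the question this class of tool answers, at far higher fidelity.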
Another emerging class of tools, called project management information systems, aims to simulate the project itself rather than the business process or infrastructure. According to a recent Standish Group report, about 50 percent of all IS projects are “challenged” and another 15 percent are outright failures. The root cause of many of these troubled projects is unrealistic assumptions about project scope and duration. The communications overhead of large projects, in particular, tends to be overlooked, resulting in delayed deliverables.
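The arithmetic behind that communications overhead is worth making explicit. As Frederick Brooks observed in The Mythical Man-Month, the number of pairwise communication channels on an n-person team grows as n(n-1)/2, so doubling headcount roughly quadruples the coordination load. The toy calculation below makes the point.

```python
def channels(n: int) -> int:
    """Pairwise communication paths on an n-person team: n(n-1)/2 (Brooks)."""
    return n * (n - 1) // 2

for team in (5, 10, 25, 50):
    print(f"{team:>3} people -> {channels(team):>5} channels")
```

At five people, everyone can talk to everyone; at 50, keeping 1,225 channels current is a project in itself, which is precisely the effect project simulators try to quantify.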
Vite’s SimVision product helps project managers make more realistic forecasts of deliverable dates and costs. In SimVision, the modeler creates a project simulation using schedule information from project planning tools such as Microsoft Project. Drawing on historical information about communications costs and templates for different project roles and interactions, the software shows where a project is vulnerable to delay or cost overruns. For large software projects (perhaps fifty or more people), SimVision mitigates risk impressively.
There are four key factors to weigh when looking for a simulation solution:
- An intelligible model. Simulators can produce a lot of data, but if people can’t make sense of the data, it isn’t much use. The model should be capable of telling a story without any words.
- Mentoring and support from the vendor or another source. Building the first model is by far the hardest part, and vendors can help you get started quickly. If people don’t have confidence in the model, they won’t use it to make decisions.
- Interfaces with other software tools such as spreadsheets, software modelers, and project planners. The data in the model and simulation are much less useful if you can’t get them out.
- Your organization’s readiness for a given type of model and simulation. Often people need evidence of the accuracy of a model before they will commit to using it; complex models may describe the situation accurately but baffle decision makers.
One of the enduring benefits of a competent simulation is that, once the first model is built, it lets us evaluate several possible futures at low marginal cost. Expensive, “bet-the-business” decisions invite debate, much of which is fueled by our differing assumptions about the future. An appropriate model allows us to vary those assumptions and ask what-if, making it easier to build commitment to a chancy decision by examining the risks the decision makers are concerned about.
Another payoff is reduced conflict. A model doesn’t have feelings, which means people can challenge the model instead of each other. Often an incomplete or somewhat flawed model is a great starting point for a group contemplating a complex decision. In fixing a raw model, people learn about each other’s interests through the neutral “container” and begin, perhaps, to engage in dialogue, not debate.
About the Author
Phil Leggiere is a professional business and technology journalist who specializes in information technology, enterprise software, and business management. Co-author Con Kenney is currently chief IT Enterprise Architect at the Federal Aviation Administration.