In-Depth

E-Commerce, E-Business, E-Everything: Applications, Problems, and (Hopefully) Solutions for an E-Commerce World

The business landscape has undergone, and will continue to undergo, drastic change. The single most dramatic contributor to this change is the rapid introduction of new information technology. Creative business models will continue to emerge that take advantage of what the technology has to offer. Several new and radical business perspectives are becoming pervasive, and they can (and will) affect everything.

Why It’s Better for Startup Businesses to Have Too Little Capital, Rather Than Too Much. Historically, ample capital implied that inventory would be amply stocked on opening day. But in current IT environments, a large inventory needlessly ties up capital. With an aggressive e-commerce approach, orders to suppliers can be generated and delivered electronically, minimizing the need to stock lots of product. Suppliers who can respond quickly to orders will replace suppliers that cannot.

How Studying Customers -- Rather Than Your Competition -- Gives You a Competitive Edge. Studying the methods and practices of one’s competition can sometimes provide insight into their strengths and weaknesses. The key word is sometimes -- there’s no guarantee that a weakness can be found and exploited. With Web technologies (e.g., cookies), an organization can study and remember a customer’s likes and dislikes, and use that information to improve its product and service offerings. Streamlining the buying process, especially on the Web, makes a positive impact on customers.

Why It Can Be Life-Threatening to Your Organization to Pursue Too Many Good Ideas, or to Grow Too Fast. Because Web technologies are relatively new, it’s easy to fall into a trap of simultaneously exploring too many different technologies for possible exploitation. Doing one thing really well will have a greater positive impact on customers than barely being able to do many things. Customers remember positive experiences and will return for repeat business, while negative experiences send them to the competition.

Why Your People Pose a Greater Threat to the Health of Your Business than Your Competition Does. Valued, trusted employees -- that’s what all organizations want. Over time, these employees will enjoy different measures of success. But success is a lousy teacher -- it gives one a false sense of "I can do no wrong." Ingenuity and creativity are not naturally fostered in an environment that has enjoyed moderate success. People naturally want to repeat what they’ve done in the past to attain success once again. Blindly following past successes or industry trends can be devastating to an organization. Consider 8-track audio tapes, Beta VCR tapes or vacuum-tube televisions as examples of good technologies gone bad. Today’s emerging technologies demand creative thought to figure out how best to apply them to improve products and services for customers.

How Integrating Your Business Virtually Can Make the Difference Between Being Quick and Being Dead. It all comes down to speed: a potential customer visiting a Web site and finding his or her needs met easily and quickly makes all the difference. Not finding the product or service, or not being able to connect to the Web site at all, will send the customer to the competition. All business processes need to be tied together with this philosophy in mind. And while this "tying together business processes" philosophy may sound great, the not-so-obvious truth is that the underlying hardware, system software and applications must be sized correctly and tuned effectively to operate efficiently.

All of the above maxims ring true as computing enters the next millennium. The challenge to computer performance evaluation (CPE) and capacity planning (CP) is twofold: first, to understand how to use the technology to find solutions to classic problems, and second, to model and effectively manage the new applications that emerge.

The Past: Measurement, Analysis, Reporting and the Performance Database

Historically, the philosophy behind measurement has been simple: if you can capture it, then you have to save it to a file. Our industry has been rather obsessive about this philosophy, resulting in an incredible glut of performance metrics. Unfortunately, not much thought went into answering the question, "What can we do with this metric?" In the e-commerce world, with Web servers being born every day, the luxury of pondering what to analyze to determine future capacity needs just isn’t there. Analysis routines are classically housed in statistical techniques such as regression (for workload trending) or cluster analysis (for workload characterization). Reporting has long consisted of classic tabular reports with many columns of boring data. Many of us remember job responsibilities that involved poring over these reports looking for anomalies. And these reports often come from data stored in a performance database, which grows each day, presenting us with the problem of devising an effective archiving scheme.

Shouldn’t the future address these long-standing problems? Data dictionaries go a long way toward helping us understand the properties of specific metrics. But perhaps we should apply statistical analyses to the stored metrics to identify those that actually correlate with the performance users experience. If meaningful metrics were identified, along with their relationships to other meaningful metrics, we could likely cut down on the number of metrics we store in the performance database. Ideally, we could eliminate useless reports.
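
As a simple illustration, the following sketch (with invented metric names and sample values) ranks stored metrics by how strongly they correlate with a user response-time series; anything below a chosen threshold becomes a candidate for pruning from the performance database:

    # Sketch: rank stored metrics by how strongly they track user response time.
    # The metric names and hourly samples below are invented for illustration.
    from math import sqrt

    def pearson(xs, ys):
        """Plain Pearson correlation coefficient for two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy) if sx and sy else 0.0

    def keep_meaningful(metrics, response_time, threshold=0.5):
        """Keep only metrics whose |r| against response time crosses the threshold."""
        scored = {name: pearson(series, response_time) for name, series in metrics.items()}
        return {name: r for name, r in scored.items() if abs(r) >= threshold}

    metrics = {
        "cpu_busy_pct":  [55, 60, 72, 80, 91, 95],   # tracks response time closely
        "mounted_tapes": [3, 2, 3, 3, 2, 3],         # noise, as far as users care
    }
    response_time = [0.8, 0.9, 1.4, 1.9, 3.0, 3.8]
    print(keep_meaningful(metrics, response_time))    # keeps cpu_busy_pct only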

In addition, analyses should identify critical thresholds for specific metrics and, more importantly, combinations of thresholds among multiple metrics that indicate exceptions. With systems becoming more complex, the focus of reporting must change to an alert orientation, followed by a drill-down capability indicating why the alert was issued. We need to rapidly find the underlying evidence that corresponds to a problem and (ideally) what should be done to correct the condition. We’ll discuss this again in the section entitled "The Future."
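
A minimal sketch of that alert-plus-evidence idea, with hypothetical metric names and limits: an exception fires only when every threshold in a combination is breached, and the alert carries its own drill-down detail:

    # Sketch: fire an alert only on a *combination* of breached thresholds,
    # and attach the evidence for drill-down. Names and limits are illustrative.
    RULES = [
        {
            "name": "memory pressure",
            "limits": {"cpu_busy_pct": 85, "paging_rate": 200},  # all must be exceeded
            "advice": "Paging while CPU-bound: suspect undersized memory, not CPU.",
        },
    ]

    def check(sample):
        alerts = []
        for rule in RULES:
            breaches = {m: sample[m] for m, limit in rule["limits"].items()
                        if sample.get(m, 0) > limit}
            if len(breaches) == len(rule["limits"]):   # every condition breached
                alerts.append({"alert": rule["name"],
                               "evidence": breaches,    # the drill-down
                               "advice": rule["advice"]})
        return alerts

    print(check({"cpu_busy_pct": 92, "paging_rate": 450}))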

The Present: Web Capacity Planning and Heterogeneous Solutions

E-commerce and Web-based applications are coming into their own. Web sites now offer online catalogs for customers to browse, while service-oriented organizations are building information kiosks that can quickly direct visitors to the services and information they seek. Government agencies are embracing the Web as well, offering information previously hard to find anywhere. Bureaucracy, the single greatest problem that bogs down companies and government agencies, is finally being attacked and minimized, thanks to the focus on understanding and building streamlined procedures for implementation on the Web.

But within this exciting new e-world, the capacity planner is faced with an old problem posed a new way. Consider the following "dinosaur" scenario: A transaction involves using CICS to access data from a DB2 database. CICS is up, but the DB2 database is down. Is the application considered available? Performance reports addressing individual subsystems will show that CICS was available. But, since DB2 was down, the transaction couldn’t run.

Consider a common e-commerce application from a retail-type Web site: someone places an order via the Web for a complete PC package: monitor, printer, system unit, etc. Perhaps the monitor comes from one supplier, the printer from another, and the other components from yet a third supplier. Once the transaction is entered, orders are placed electronically with the appropriate suppliers. What if the Web site of one of the selected suppliers is down, or just very slow? Is the application considered available?
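
A back-of-the-envelope sketch makes the point, assuming (purely for illustration) that the pieces fail independently and that the transaction needs every one of them:

    # Sketch: a serial transaction is only as available as *all* of its parts.
    # The component availabilities below are invented for illustration.
    components = {"web_front_end": 0.999, "monitor_supplier": 0.98,
                  "printer_supplier": 0.95, "system_unit_supplier": 0.99}

    end_to_end = 1.0
    for name, availability in components.items():
        end_to_end *= availability        # serial dependency: all must be up

    print(f"End-to-end availability: {end_to_end:.1%}")   # roughly 92%

The composite is noticeably less available than any single piece of it -- something reports addressing individual subsystems will never show.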

We have all visited Web sites that are just too slow. How many times are you willing to go back? The key to success in the new e-information age is not only attracting visitors to a Web site, but making them want to come back again and again. So while your organization’s Web site might be up, function-rich and performing well, its dependence on a poorly performing partner Web site can be disastrous to you.

E-commerce and similar Web-based applications involving external links to "alien" Web sites are highly dependent on the availability and performance of those other Web sites -- often not under the control of the originating application. Expect emerging e-commerce applications to adopt strategies that automatically switch to a different supplier (i.e., Web site) if no response is received within some preset timeout. Additionally, expect suppliers who often exhibit poor performance and low availability to be relegated to low-priority positions on an organization’s list of suppliers. All of this will involve capturing response time information for linked Web sites, storing those times, analyzing them and ultimately automating the policy decision about using a supplier in the future.
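
A sketch of such a failover policy appears below; the supplier URLs, the place_order() helper and the simple priority ordering are hypothetical stand-ins, not a recommendation for any particular product:

    # Sketch: try suppliers in priority order, time each attempt, fail over on
    # timeout, and log response times so the priority list can be re-ranked later.
    import time
    import urllib.request

    SUPPLIERS = ["https://supplier-a.example/order",    # hypothetical endpoints
                 "https://supplier-b.example/order"]
    TIMEOUT_SECONDS = 5.0
    response_log = []                     # (supplier, seconds or None) history

    def place_order(payload):             # payload: the order, as bytes
        for url in SUPPLIERS:
            start = time.monotonic()
            try:
                req = urllib.request.Request(url, data=payload, method="POST")
                with urllib.request.urlopen(req, timeout=TIMEOUT_SECONDS) as resp:
                    response_log.append((url, time.monotonic() - start))
                    return resp.read()            # order placed; stop here
            except Exception:                     # timeout or connection failure
                response_log.append((url, None))  # None marks a failed attempt
                continue                          # fail over to the next supplier
        raise RuntimeError("no supplier responded within the timeout")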

One more thought: modeling tools are not yet sophisticated with respect to modeling Web sites. If your organization has a popular Web site, and hit counts are increasing rapidly, the capacity planner is faced with the age-old problem of determining what will be needed to process the increased workload effectively while providing acceptable performance. The availability and performance issues just discussed must be included in any modeling exercise as well. Expect modeling tools to begin to focus on these questions.

The Future: Applications That Will Exploit New Technologies

The future is often difficult to predict. Yet, technologies exist today that would appear to be ready to change many aspects of how business will be conducted.

Data Mining. Analytical software enables you to shift human resources from rote data collection to value-added customer service and support, where the human touch makes a profound difference. Intelligent software will continuously scan sales data, tracking trends and noticing which products are selling and which are not. No longer will sales associates plow through fat paper reports to find out if sales are going well. If sales are proceeding as desired, there will be no human intervention; only exception reports will be generated. The implementation and performance of this type of support software will be paramount to the success of an organization.
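
As a toy illustration of exception-only reporting (the products and expected sales bands are invented), a scan surfaces only what needs human attention:

    # Sketch: print a line only for products whose sales fall outside an
    # expected band; everything on track stays silent.
    expected = {"red widgets": (900, 1200), "blue widgets": (400, 700)}
    actual   = {"red widgets": 1150, "blue widgets": 250}

    for product, units in actual.items():
        low, high = expected[product]
        if not low <= units <= high:          # only exceptions surface
            print(f"EXCEPTION: {product} sold {units} units (expected {low}-{high})")
    # red widgets never appear -- their sales are proceeding as desired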

Clearly, there is a natural resistance to giving up any decision-making function and letting a machine do it. But managing systems that generate exponentially growing volumes of performance data is impossible to do manually. With the increasing number of systems and the complexity that brings, we’re simply incapable of recognizing patterns in such large amounts of data.

Using data mining techniques, we can find useful patterns in large amounts of data. Online analytical processing (OLAP) was the first target for data mining. Data originally collected for accounting and bookkeeping purposes was recognized as a potential mine of information for modeling, prediction and decision support. This is very similar to what happened in the performance arena: measurements designed for accounting purposes and collected on the mainframe were subsequently used for early performance analyses. Companies began creating corporate data stores, or data warehouses, to satisfy new demands for business analysis. Sophisticated data mining tools can navigate through an information-rich environment without requiring users to be experts in statistics, data analysis or databases.

The not-so-distant future in CPE and CP is bright for the application of data mining. Historical data abounds across a multitude of metrics. The challenge practitioners face is formulating the key questions. Data mining should be able to integrate sales-type figures with performance metrics to provide new insights into business relationships. We should (hopefully) be able to define real business transactions and Natural Forecasting Units (NFUs) that would allow queries such as, "How much more memory do I need if my sales of red widgets increase by 20 percent?"
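
A minimal sketch of how such a query might be answered, assuming a simple straight-line relationship between the NFU and memory demand (the sample points are invented; a real study would draw on the performance database):

    # Sketch: fit memory demand as a linear function of widget sales (the NFU),
    # then extrapolate a 20 percent sales increase.
    sales  = [1000, 1200, 1500, 1800, 2000]   # red widgets sold per day
    memory = [210, 240, 285, 330, 360]        # MB of memory in use at peak

    n = len(sales)
    mx, my = sum(sales) / n, sum(memory) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(sales, memory))
             / sum((x - mx) ** 2 for x in sales))

    today, grown = 2000, 2000 * 1.20
    extra_mb = slope * (grown - today)
    print(f"A 20% sales increase implies roughly {extra_mb:.0f} MB more memory.")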

Training and Distance Learning. Training should be available at the employee’s desk as well as in the classroom. All training resources should be online, including systems to provide feedback on the training. We’ve all heard about the use of correspondence courses by professionals who wanted to be trained in a new area. Originally, a correspondence course was conducted by mail, with assignments received by a student, worked on at the student’s own pace, and returned by mail when completed. This educational model was never really as good as the traditional classroom, but it was effective.

With current multimedia and Web technologies, the notion of a correspondence course has been redefined -- another facet of e-commerce. Distance learning now allows geographically dispersed students to interact with each other as well as with an instructor. Multimedia lectures are being created that can be viewed at any time by a student. Static Web pages are being used to describe assignments, identify required reading and supplement an audio or video lecture. Listservs and newsgroups allow students to respond asynchronously to questions posed by an instructor, and everyone in a class can see both the question and the answer. Although asynchronous, these technologies allow for interaction in a virtual classroom.

There are some advantages to this mode of learning, too. How many times have you sat in a "real" class with a question in your head, but felt a bit intimidated about speaking up? In an asynchronous mode, all eyes will not be on you, and some of that intimidation should be gone. In addition, we expect chat rooms to be set up for distance learning classes, where all students in a class connect at the same time and interact in real time. Chat rooms would supply a synchronous mode of learning, similar to the traditional "live" classroom experience.

Tools such as Microsoft Camcorder (packaged as part of Windows 98), which capture mouse movements and clicks as well as keystrokes, have been available for some time. They produce .avi files that can be played back on a variety of platforms, so we would expect training in the use of specific tools to adopt this technology. Such tools also allow an audio track to be recorded, annotating what the user is seeing on the screen. We’ve already seen this within Microsoft’s popular PowerPoint presentation package, which allows voice annotations to be added to each slide.

What does distance learning imply with respect to performance and capacity? Expect dedicated training servers to emerge, as audio and video files can be unruly with respect to size. And because those files are potentially very large, capacity planners will likely have to concentrate on sizing network bandwidth correctly. This may, in fact, become the key issue in developing a true distance learning-based training environment.
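
A back-of-the-envelope sizing sketch, in which the stream bit rate, class size and headroom factor are all assumptions rather than measurements:

    # Sketch: rough bandwidth needed for concurrent lecture streams.
    video_kbps_per_student = 300     # assumed compressed lecture stream
    concurrent_students = 120        # e.g., two classes of 60 watching at once
    headroom = 1.5                   # allowance for bursts and retransmission

    required_mbps = video_kbps_per_student * concurrent_students * headroom / 1000
    print(f"Provision roughly {required_mbps:.0f} Mbps of network bandwidth.")
    # 300 kbps x 120 students x 1.5 = 54 Mbps -- the network, not the server,
    # is the likely bottleneck.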

Reduced Time for Data Collection. Portable devices and wireless networks extend your information systems into the factory, warehouse and other areas. Small portable devices are emerging that make data gathering easy. Both Saturn and Boeing have implemented trouble-ticket reporting using handheld PCs, resulting in dramatic reductions in turnaround time for problems. Wireless inventory reporting systems improve accuracy and cut data collection time in half. Ultimately, these systems will proliferate throughout organizations and must be considered within any significant capacity planning exercise. In effect, the workload these devices present differs dramatically in quantity from traditional key-in transactions, and the effective transaction arrival rate will differ as well.

Artificial Intelligence and Proactive Alert Policies. Many IT organizations make extensive use of automated event detection systems to discover potentially harmful software, hardware and environmental events. But along with event management comes an exposure to alarm showers -- a situation in which a large number of events are triggered at the same time and the management interface becomes overwhelmed. Administrators can no longer ascertain where the most severe problems reside, let alone their root cause.

Component-based applications are built using applets, Java, or ActiveX and are assembled dynamically at runtime through a browser interface. As the number of these applications increases, new events will need to be tracked, such as component collisions, lack of space for required components, missing components, component version dependencies, expired components and component corruption. An IT organization already overwhelmed with alarm showers will likely be buried by the introduction of component-based applications. It is not humanly possible to monitor and manage all the events, or state changes, in a distributed enterprise of component-based applications.

To address this problem, the IT staff must manage at a higher level. Management tools must be able to dynamically adapt and learn about new applications and technology introduced into the environment. Rather than manually building management rules for each new application, IT needs a management tool that can automatically learn about new applications, filter out the superfluous information and infer when the application is shifting from a desired state. Furthermore, it would be advantageous if the management tool could proactively predict when there might be a change in the desired state of an application.

The only hope in this arena is the application of artificial intelligence (AI) to performance. AI agents would accumulate a large set of statistics from the system being monitored. Via real-time analyses, an agent would learn which system states are deemed "good" and "bad." Once a significant number of observations had been accumulated, the agent would be able to heuristically determine whether the system has departed from its desired state. If so, a management interface (e.g., console, pager, e-mail) could be alerted.
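
A deliberately tiny sketch of that heuristic, using a nearest-neighbor rule over whatever labeled observations the agent has accumulated (the metrics and sample states are invented; a production agent would be far richer):

    # Sketch: judge a new observation by its nearest labeled neighbor and
    # alert when the verdict is "bad".
    from math import dist   # Euclidean distance

    history = []             # (label, observation) pairs the agent has learned

    def learn(label, sample):
        history.append((label, sample))

    def classify(sample):
        """Nearest-neighbor verdict over everything the agent has seen."""
        label, _ = min(history, key=lambda pair: dist(pair[1], sample))
        return label

    # Observations: (cpu_busy_pct, queue_length)
    learn("good", (45, 2)); learn("good", (60, 3)); learn("bad", (95, 40))

    if classify((92, 35)) == "bad":
        print("ALERT: system appears to be departing from its desired state")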

This approach has multiple benefits, including vastly reduced network traffic, minimized CPU overhead on the systems being managed, and the ability to predict when a system is about to transition out of a "good" state. Expect IT organizations to efficiently manage multiple component-based, enterprise applications using this type of approach.

The ability to predict state changes has many potential benefits. This technology can be used to ascertain when a job stream -- for example, a backup -- is unlikely to complete in the desired timeframe because of anticipated network traffic. The job stream could be automatically modified to start the backup an hour earlier. AI analysis can provide the ability to learn from experience -- experience vastly more complex than humans can assimilate. This type of analysis will make recommendations on changing existing management policies to improve the overall service provided.
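
Continuing the backup example, a sketch of the rescheduling logic, with illustrative figures for the backup size, window and learned per-hour throughput:

    # Sketch: if the predicted transfer time exceeds the window at the
    # scheduled hour, start the backup an hour earlier. Figures are invented.
    backup_gb = 120
    window_hours = 4.0
    start_hour = 1                             # scheduled for 01:00

    predicted_mbps = {0: 90, 1: 40, 23: 95}    # learned effective throughput by hour

    def hours_needed(mbps):
        return backup_gb * 8 * 1000 / mbps / 3600   # GB -> megabits -> hours

    if hours_needed(predicted_mbps[start_hour]) > window_hours:
        start_hour = (start_hour - 1) % 24          # move the start an hour earlier
    print(f"Backup start: {start_hour:02d}:00")     # -> 00:00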

Potentially, "what if" scenarios could be posed and then simulated to determine the effects of configuration changes and technology introductions without committing to suffering their consequences.

Summary and Final Thoughts

"I don’t see information technology as a stand-alone system. I see it as a great facilitator. And maybe most important, it’s a reason to keep asking yourself the question: ‘Why, why, why?’" -- Paul O’Neill, Chairman and CEO of Alcoa.

The simple truth is that information technology enables reengineering. And advances in information technology will force organizations that want to compete in the e-commerce world to keep asking O’Neill’s "Why, why, why?" question. Capacity planners and performance professionals have been, and will continue to be, confronted with the challenge of understanding the impact that new technology will have on an organization and how it does business. Perhaps that is the one constant that permeates CPE and CP. The objective here was to enlighten you to some of the issues coming down the superhighway.

Finally, consider some of the following business lessons as we approach the next millennium … it is likely that they, too, will impact performance:

• A lousy process will consume 10 times as many hours as the work itself requires. A good process will eliminate the wasted time; technology will speed up the remaining real work. So, in the e-commerce world, it’s more important than ever before to work smarter.

• A CEO must regard IT as a strategic resource to help the organization generate revenue and/or remain viable.

• The CIO must be an integral part of developing the business strategy, and must be able to articulate in plain language what IT can do to help that strategy.

• PCs and connectivity make new educational and training approaches, like distance learning, possible.

• Training costs should be treated as part of an organization’s basic infrastructure costs, especially in an era of rapid introduction of new technologies. A critical component of competitive advantage is having personnel trained to take maximum advantage of the new technologies.

• In business, as well as the government, he who has the shortest procurement and deployment cycle wins.

About the Author: Bernie Domanski, Ph.D., is a professor of Computer Science at the Staten Island campus of the City University of New York (CUNY). He is the author of over 50 papers, and is CIO of the Computer Measurement Group (CMG).
