The Reality of Network Management

Businesses today are spending large sums of money – often millions of dollars – building and maintaining the most advanced and efficient information technology (IT) departments. But, when it comes to managing their enterprise network, many are only just becoming aware of a "hidden" cost of ownership: the distribution of new software and upgrades throughout the enterprise.

Many companies are laboring under the premise of computing’s formative years – the 1970s and 1980s – when hardware was the biggest cost factor in technology ownership. However, as we approach the dawn of the new millennium, the cost of software and its many upgrades has far exceeded that of hardware, and distributing software has become a burden to many IT managers and their organizations.

Five years ago, mainframe MIS professionals spent approximately 40 percent of their IT budget on hardware, 12 percent on software and the balance on MIS staff. In the last couple of years, those proportions have reversed, with 40 percent of IT budgets going to software products and licensing and only 10 to 12 percent to hardware. In fact, the Meta Group says that for every dollar spent on hardware upgrades, companies spend $7 on software and licenses. And that figure does not even include the cost of distributing software to users.

Over the past 25 years, business computing has evolved from large mainframe computers secluded in back rooms to PCs on every desktop. While businesses could easily control mainframe configuration to maintain standards and ensure reliability, today’s diverse networks of mainframes, minicomputers and PCs are far more difficult to manage. Companies must configure, maintain and update operating systems, productivity applications and the display portion of business software for thousands of computers across the enterprise.

Ironically, most companies today use a time-consuming and cost-prohibitive manual approach to software distribution, rather than seeking a technology-based solution. However, some leading-edge organizations like The City of New York; the law firms Skadden, Arps, Meagher & Flom, and Milbank, Tweed, Hadley & McCloy LLP; and the investment firms T. Rowe Price and Donaldson Lufkin and Jenrette, are streamlining software distribution and saving up to 90 percent of that hidden cost.

These organizations have found an intelligent process to streamline enterprise software distribution, which provides businesses with a simple, efficient and cost-effective way to manage and customize PCs without human intervention.

Milbank, Tweed, a leading financial and corporate law firm based in New York City, for example, has significantly reduced its spending on IT staff and overtime since selecting a technology-based solution. According to the firm's IT manager, without intelligent software distribution, Milbank, Tweed would have to dedicate a full-time technician in each of its eight worldwide offices just to keep up with software additions and changes.

This article will look at the history of software distribution, the rise of the enterprise and how some companies successfully manage today’s dynamic business environments.

The History

Over the past 30 years, business computing has evolved through a number of distinct phases. Initially, data processing meant complex number crunching and tabulation for back-office processes, such as corporate accounting. The primitive computer systems used for this work were essentially dedicated to a single task at any given time. Management of the machinery generally amounted to scheduling jobs and performing physical maintenance.

During the late 1960s and early 1970s, the concept of time-sharing evolved, meaning that a single data processing unit could service multiple users and tasks simultaneously. The mainframe computer that performed this work became more complex, as large amounts of data were now stored online and extended to individual offices by the use of terminals. However, a single computer installation with all of its attendant software, security and back-up needs was all that was required.

In the 1980s, the industry introduced the desktop PC, which was initially considered an individual productivity tool. As the decade wore on, system designers created a new technology – networking – that linked PCs and mainframe/minicomputers together and enabled them to communicate.

The Rise of the Enterprise

The early 1990s release of Microsoft’s Windows 3.1 created another key standard. Previously, PCs could share information with each other or act as terminals into mini- and mainframe computers. With Windows, a PC could run larger, more powerful graphical applications that interacted with programs and data stored on mainframes and sophisticated servers.

With the PC’s expanded capability, large enterprises found themselves with a diverse, often confusing network of computers, including mainframes, minicomputers and PCs of various power, all of which were now being connected to one another through enterprise networks.

Unlike mainframe configurations, which were inherently simple to manage, these hybrid systems are far more complicated and require a sophisticated computer on every user’s desk.

Each desktop must have an operating system, productivity applications, such as word processing, and the display portion of the enterprise business software that once ran solely on the mainframe. In addition, each PC must be correctly configured to communicate with one or more servers.

Compounding this problem is the continuing need for large enterprises to modify their software and distribute those modifications throughout their systems on a constant basis. Large companies like banks, insurance companies and brokerage firms that use mainframes, minicomputers and PCs routinely update their software system-wide to meet changing business needs. This is not merely to install the latest revisions of desktop productivity applications such as a word processing program, but to institute changes in the mission-critical, line-of-business software that literally runs their businesses.

It is expensive to manually install, configure and maintain software on the thousands of PCs that are now a crucial business-computing asset in large companies. We estimate that it generally takes a technician two to nine hours to configure one networked PC with new or updated software. The typical technician cost ranges from $60 to over $150 per hour, and in a large enterprise owning hundreds of thousands of computers, this expense becomes substantial.
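
To see how quickly those figures compound, the short Python sketch below simply multiplies them out. The hours and hourly rates come from the estimate above; the fleet size is an assumption chosen purely for illustration.

# A back-of-the-envelope sketch of the manual-distribution cost described above.
# The time and rate figures come from the article; the fleet size is an
# illustrative assumption, not a figure from the text.

HOURS_PER_PC = (2, 9)        # low and high estimates: technician-hours per PC
RATE_PER_HOUR = (60, 150)    # low and high technician rates, in dollars per hour
FLEET_SIZE = 5_000           # hypothetical number of networked PCs

low_cost = FLEET_SIZE * HOURS_PER_PC[0] * RATE_PER_HOUR[0]
high_cost = FLEET_SIZE * HOURS_PER_PC[1] * RATE_PER_HOUR[1]

print(f"One manual rollout across {FLEET_SIZE:,} PCs: "
      f"${low_cost:,} to ${high_cost:,}")
# Roughly $600,000 to $6,750,000 for a single enterprise-wide software change.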

There is little value added in this manual process. With the continual drop in hardware prices, the expense to add and reprogram PCs throughout an enterprise generally surpasses the cost of the equipment itself. The "total cost of ownership" has become a key metric that most chief information officers are charged with managing. Given a choice, no sane MIS/IT manager would manage software distribution by hand. Yet, most still do.

Managing Software Distribution Across the Enterprise

The worldwide use of PCs has brought with it hundreds of millions of people who are not only computer literate but who configure their own PCs to meet their individual needs. Almost all PC users not only have specific ways in which they use their desktops, but become highly protective of the manner in which their desktops are configured. Indeed, it is the individuality of each user that contributes to the creativity and productivity that the computer has created for our society. Yet, whether desktop or laptop, these computers must communicate with enterprise networks in order to operate.

The MIS/IT administrator is left in a difficult position. Even as the enterprise must have the ability to monitor, update and control each workstation, it must also preserve the individuality of each person’s PC. This paradox complicates the task of desktop management and greatly adds to the cost of configuring the enterprise network environment.

Given the enormity and relatively low added value of the job, many enterprises outsource the installation and maintenance of their networks to an independent systems integration company. There are many thousands of these companies throughout the United States and abroad. These companies install hardware and software, establish the networks and service the systems of governments, as well as private industry. Some are multi-billion dollar per year revenue companies with thousands of consultants. Others are local technical entities with two or three owner/technicians who work on smaller networks.

They all have one thing in common: They generally charge by the hour for their time and calculate the cost of making changes to a network on the basis of the number of PCs and servers that have to be modified. They compete fiercely by establishing their technical competence and then outbidding their competitors with a lower hourly charge for the intended work.

A separate issue is the constant change in the computer world. Today, businesses can no longer assume that a single PC on each employee’s desktop is the answer to every technology question. The move toward the distributed enterprise network and the acceptance of the TCP/IP communications standard – in which every computing device can interact with every other computing device – has fueled a paradigm shift in the way we do business. Employees no longer sit at their desk using the same PC every day. Worldwide sales teams, roving field agents and telecommuters, among others, require not only their own computing devices, but also an efficient way to communicate with the home office.

There are three main ways that this communication happens: Internet browser-based applications, local PCs and portable computing devices.

The browser-based model brings computing back full circle to the mainframe. Information is stored centrally – today on a cluster of NT- or UNIX-based servers instead of the mainframe – and distributed to users. These clustered machines, which may reside in the same location or worldwide, need to be updated regularly with a company’s mission-critical software.

Information stored locally is typically the province of telecommuters, remote sales forces and field personnel, all of whom prefer to maintain their personal productivity applications – calendar, word processing software or Rolodex – on their own PC, but still must connect via the Internet for certain applications and updates. This creates a virtual office environment, where insurance agents, for example, keep key data stored locally to provide customers with price quotes. These agents, however, must connect to their corporate server regularly to submit or receive data.

The third model is portable specialty computing devices like the Palm Pilot, which become offline data terminals but periodically must go online to submit data or receive updates. These devices, which utilize Windows CE, embedded Windows NT, Palm OS and other operating systems, become addressable devices that must be updated regularly. An example is overnight package delivery personnel who take in data about the packages they pick up and deliver, going online from time to time to submit data and orders to the home office.

While they seem to be very different, these three key models have an important similarity: Local PCs, portable computing devices and Web server clusters (also commonly called "server farms") must all receive software updates quickly, efficiently and cost-effectively. The task is complicated by the remote location of the end-user devices, and by the need to minimize downtime on Web servers managing critical e-commerce applications. Without automation, it is well nigh impossible.
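
One common way automation can address the downtime concern on server clusters is a rolling update: take one machine out of rotation, update it, and return it to service before touching the next. The Python sketch below illustrates the idea only; the helper names (remove_from_rotation, apply_update, restore_to_rotation) are hypothetical and do not describe any specific product.

# Illustrative rolling update across a cluster of Web servers, so that only one
# machine is out of service at a time. All helper names here are hypothetical.

def remove_from_rotation(server):
    print(f"{server}: removed from load balancer")

def apply_update(server, package):
    print(f"{server}: installing {package}")

def restore_to_rotation(server):
    print(f"{server}: back in service")

def rolling_update(servers, package):
    """Update one server at a time so the cluster keeps serving traffic."""
    for server in servers:
        remove_from_rotation(server)
        apply_update(server, package)
        restore_to_rotation(server)

rolling_update(["web-01", "web-02", "web-03"], "storefront-v2.3")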

Automating the Process

Leading-edge businesses are embracing software that automatically makes changes to every PC on a network, regardless of its location anywhere in the world or the type of device (See Table 1). These technology-based solutions – which can automatically download software updates to each PC or other device on a complex enterprise network – will have a profound impact on every aspect of the computer business by:

• Eliminating the need for, and cost of, technicians spending up to nine hours on each PC making manual changes.

• Reducing the time needed to introduce new and upgraded software by orders of magnitude.

• Bringing predictability and reliability to application management.

• Giving IT administrators the ability to control the enterprise network without taking away the individuality of the PC user.

• Radically altering the manner in which systems integration companies work and bill. (Economically speaking, integrators that did not license and use such products would be priced out of the market, because competitors using them could easily underbid them.)

• Facilitating the continued growth of the virtual office.

Technology-based intelligent software distribution should provide an end-to-end solution for transporting, packaging, customizing and delivering software applications. In contrast, complex enterprise management frameworks – which provide comprehensive network management services – can transport software applications, but require IT professionals to manually trigger transport and rely on third-party tools for packaging and delivery.
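
To make the contrast concrete, the following Python sketch outlines the stages of such an end-to-end pipeline. It is only an illustration under assumed names: package_application, customize_for_site and deliver_to_targets are hypothetical placeholders for the packaging, customization and delivery stages described above, not the API of any particular product.

# Hypothetical outline of an end-to-end software-distribution pipeline.
# Each function is a placeholder for a stage described in the article;
# none of these names comes from a real product.

def package_application(install_name):
    """Capture an application as a self-contained, redistributable package."""
    return {"name": install_name, "files": [], "settings": {}}

def customize_for_site(package, site):
    """Apply per-site settings (server names, drive mappings, and so on)."""
    customized = dict(package)
    customized["site"] = site
    return customized

def deliver_to_targets(package, targets):
    """Push the finished package to every managed PC or device, unattended."""
    for target in targets:
        print(f"delivering {package['name']} ({package['site']}) to {target}")

# End-to-end run: no technician touches an individual desktop.
pkg = package_application("wordproc-v8-update")
offices = {"new_york": ["ny-001", "ny-002"], "london": ["ldn-001"]}
for office, machines in offices.items():
    deliver_to_targets(customize_for_site(pkg, office), machines)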

Companies seeking a sophisticated software solution should look for the following:

• Robust three-tier client/server repository architecture for the scalability required by large enterprises.

• "Delta" packaging technology to capture applications using "before" and "after" snapshots of a software installation, allowing system administrators to focus on the final result, rather than the setup process.

• Automated installation and management of a fresh copy of the Windows operating system, so the delta image is taken from a clean baseline.

• A method of capturing sequences of keyed commands, giving administrators a flexible way to script customizations that cannot be fully captured using "delta" or "snapshot" technology.

• Automated packaging and distribution of NT system services.

• Y2K compliance.

• Intelligent uninstall and rollback capabilities, enabling administrators to reverse any distribution after the fact.

• Multi-level security, allowing enterprises to assign various elements of the software packaging, distribution and monitoring process to different individuals and groups within the IT organization.

• A centralized IP-based replication facility which lets enterprises stage application and customization data to a limitless number of locations worldwide from a central point.

• A "Gold CD" capability, allowing administrators to export all or part of a the application library to CD-ROM or other removable media for automated execution on PCs and laptops not connected to the high-speed network.

Conclusion

Today’s world of business computing is changing faster and in ways never before imagined. In order to stay ahead of the competition and keep pace with ever-changing technology, companies need to find ways to simplify burdensome tasks, such as manual updating of multiple PCs and other computing devices across an enterprise. Technology exists to help IT managers and CIOs economize and improve efficiency across the enterprise, but they need to join other leading-edge organizations that have begun to improve their business practices by automating the software distribution process.

About the Author: Dov Goldman is Chairman and Chief Executive Officer of Cognet Corp. (Valhalla, N.Y.; www.cognet.com).