Raising a Toast to the New Millennium

While the Y2K problem loomed, many industry observers and pundits wondered whether our systems would make it into the next millennium.

Now, as the 2000 mark passes, new and exciting developments are waiting in the wings to turn computing on its head. The first years of the new millennium will commence with a flood of pent-up IT initiatives that were held back by Y2K work, many of them centered on e-business and new ways of leveraging information.

Y2K Advances IT

Some companies took a one-time Y2K hit on their finances and resources, but for others Y2K fueled a management revolution.

"Now, more than ever, it's clear that management understands the value of IT," says Howard Rubin, research consultant at Cap Gemini (www.capgemini.com). "Business and IT are inseparable. They understand its potential for tackling strategic business objectives. Business managers are assuming hands-on responsibility for making sure that their Year 2000 project and their post-Year 2000 investments pay off and for making sure that information technology achieves their company's strategic business goals and is well-aligned to those goals."

Y2K also laid the groundwork for other initiatives, such as e-commerce and supply chain management. "Year 2000 has boosted understanding of the mechanics of business-to-business relationships," says Leeland Freeman, vice president of strategic relationships at Source Recovery Co. L.L.C. (www.sourcerecovery.com) and a Year 2000 consultant.

Many new IT projects were joined with Y2K initiatives. Some companies invested heavily in enterprise systems and other new technology. Others developed methodologies and management styles that will help them implement new technologies. Still others achieved new levels of cooperation and understanding with business partners and customers.

Monsanto Corp. (www.monsanto.com) is using its Y2K knowledge to expand its customer relationship management (CRM) and enterprise resource planning (ERP) implementations. "We're trying to use information technology to help us innovate and deliver new products and new ideas to the marketplace much faster and in a much more broad manner than previously," says John Ogens, director of Monsanto's Year 2000 program. "We're leveraging information that we've gathered in Year 2000 work. We've got a much better understanding of our existing application portfolio, our existing infrastructure portfolio."

Consolidated Freightways' (www.cfwy.com) Year 2000 effort spanned all departments, analyzing problems and implementing solutions. The company's Year 2000 team not only rewrote or retired more than 18 million lines of code on CF's mainframes and desktop PCs, but also replaced older legacy systems outright with state-of-the-art, Y2K-compliant systems.

The bottom line is that after Year 2000, IT organizations may never be the same. "As a direct result of Y2K, there has been a rise in standardization, good quality practices, and independent verification and validation," says Leon Kappelman, Ph.D., associate professor at the University of North Texas and co-chair of the Society for Information Management's Year 2000 Working Group.

Windows NT/2000 to Benefit

Y2K forced many companies to cast away outdated code and systems. As a result, only a handful of major operating systems will march into the new millennium: Microsoft Corp.'s Windows NT/2000, IBM Corp.'s OS/400 and MVS, and three variants of Unix: Sun Microsystems Inc.'s Solaris, IBM's AIX, and Hewlett-Packard Co.'s HP-UX. Many vendors are being forced out of the platform manufacturing business and into niche or service opportunities, new businesses, or component integration, says Tom Bittman, senior analyst at GartnerGroup Inc. (www.gartner.com).

"The Unix world is in shambles," he adds. "Of the 12 Unix vendors receiving R&D investment today, only five will continue to receive investment by 2000, and perhaps only three by 2003." Windows NT/2000 will be the key engine for growth within the worldwide server market over the next five years. Windows will expand its share of the worldwide server market from 13.8 percent in 1998 to 30 percent in 2003 -- a 25 percent compound annual growth rate, according to a study by International Data Corp. (IDC, www.idc.com). "NT's phenomenal projected growth rate will put pressure on every aspect of the server market," says Vernon Turner, vice president at IDC.

The expected release of Intel's Itanium IA-64 chip will drive the trend toward fewer versions of Unix. "Intel IA-64 will become the predominant server architecture by 2003, forcing non-Intel architectures to compete for diminished market share," Bittman says. "Only Sun's SPARC and IBM's PowerPC processors will pose significant competition to Intel." In addition, it is difficult for ISVs to continue to support multiple Unix variants, he adds.

IDC predicts that Unix servers as a group will retain their leadership position as the largest operating system platform. By 2003, they are expected to capture $37 billion in end-user spending, or 41 percent of the total server market. But a new wild card, Linux, has to be accounted for. Through 2003, total Linux commercial shipments will grow faster than the total shipments of all other operating systems, IDC predicts. More application vendors -- such as SAP -- are porting offerings to Linux; hardware vendors -- such as IBM -- are continuing to expand the product lines running Linux for server-side endeavors.

Internet2: The Sequel

Universities and government agencies are now interconnecting with networks that are 100 to 1,000 times faster end-to-end than today's Internet. Both the Next Generation Internet -- used by government agencies -- and Internet2 -- used in academic environments -- are research and development programs. The initiatives include new advanced infrastructure development -- for example, networks that can perform at much greater levels than today's commercial Internet -- advanced applications development, and research into technologies that will enable advances in infrastructure and applications.

It's only a matter of time before these networks are available for commercial applications, says Rich Hall, director at IBM's International Center for Advanced Internet Research (ICAIR). "The original Internet grew out of academic research," he says. "If the cycle follows itself again, this next-generation Internet will do the exact same thing, but will come out at a speed that's significantly faster than the first one."

Internet2 universities connect to the National Science Foundation's very high performance Backbone Network Service (vBNS), which runs over Qwest Communications Int'l Inc.'s (www.qwest.com) nationwide fiber-optic network, with technologies provided by Cisco Systems Inc. and Nortel Networks (www.nortel.com).

Current applications delivered over the network include virtual laboratories, digital libraries, and distance learning. Ultimately, the next-generation Internet will "do a fair amount of load balancing and placing content and services farther out on the network, closer to the user," Hall says.

The Collapse of Software Pricing Models

Increasingly, processing power is becoming a utility that can be accessed anywhere and at any time, as long as there is a connection. "If you buy a new microwave, you just go home and plug it into the wall of your house," says Nick Bowen, director of servers at IBM Research. "You don't have to call the electric company to ask for an extra 200 watts of power." As computing requirements change, "We'd like to be able to give [customers] the parts for which that solution will run. Not make them have to buy a low-end server, or deal with databases and e-mail accounts," Bowen says.

This kind of service is already evident in the rise of Internet service providers (ISPs) and application service providers (ASPs). As a result, companies will buy less and less software, and instead will rent the majority of their applications. Forrester Research (www.forrester.com) predicts that application outsourcing will exceed $21 billion by 2002, and expects rental applications to account for about $6 billion of that total. "ASPs will fundamentally change the way enterprises conduct business," says Greg Blatnik, vice president at Zona Research Inc. (www.zonaresearch.com). "The infrastructure is already in place for ASPs to facilitate the birth of the virtual enterprise. Companies choosing to outsource applications will gain greater access to a broader range of applications."

Rental applications are an extension of the outsourcing business, says Jeetu Patel, vice president of research at Doculabs Inc. (www.doculabs.com). Outsourcing has spread beyond operational or clerical functions into areas such as Web infrastructure, he notes. "The next step is to farm out even more sophisticated applications that may be approaching commoditization, such as e-mail, scheduling, accounting, HR, payroll, and even e-commerce."

Once lingering security and performance concerns are addressed, the concept may dramatically alter the way companies buy and use software. Instead of being purchased as an entire package, some applications will be rented on an as-needed, one-time basis. The required application is downloaded from an ASP or ISP, which holds a master license, and used for a few hours, days, or weeks. The user is then billed by the application provider.
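The mechanics of that model are straightforward to sketch: a provider holds the master license, grants short-term leases, and bills for hours actually used. The following toy model is purely illustrative -- all names, rates, and structures here are hypothetical, not any vendor's actual system:

```python
from dataclasses import dataclass, field

# Illustrative hourly rates for two hypothetical rental applications.
HOURLY_RATES = {"teamroom": 2.50, "erp_module": 12.00}

@dataclass
class Lease:
    """A short-term grant of one application to one customer."""
    app: str
    hours_used: float = 0.0

    def charge(self) -> float:
        # Bill only for time actually used, per the as-needed model.
        return round(HOURLY_RATES[self.app] * self.hours_used, 2)

@dataclass
class ServiceProvider:
    """An ASP/ISP holding the master license and metering usage."""
    leases: dict = field(default_factory=dict)

    def rent(self, customer: str, app: str) -> Lease:
        lease = Lease(app)
        self.leases[(customer, app)] = lease
        return lease

    def bill(self, customer: str) -> float:
        # Total up every lease this customer holds.
        return round(sum(lease.charge()
                         for (cust, _), lease in self.leases.items()
                         if cust == customer), 2)

asp = ServiceProvider()
lease = asp.rent("acme", "teamroom")
lease.hours_used = 40           # used for one work week, then returned
print(asp.bill("acme"))         # 40 h x $2.50/h -> prints 100.0
```

The key design point the article describes is that licensing and metering live with the provider, so the customer's obligation ends when the lease does.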

The concept was brought into the corporate light by IBM's Lotus Development Corp. in 1997. Lotus promoted ISP delivery of rental applications off Domino Web servers. At the time, Lotus introduced the first rental application it would license to ISPs, a collaborative environment called Instant Teamroom. ERP vendors are now renting applications, as well.

SAP AG (www.sap.com), through its mySAP.com hosting initiative, offers customers the option of accessing SAP R/3 modules through a third-party hosting service. "Now, a customer does not need to buy the hardware, does not need to employ the specialist for system maintenance, and does not need to have all the application skills," says Peter Graf, director of technology marketing at SAP. "All they need is a Web browser to access a system that's provided by a service provider."

Other software vendors are joining the trend. Last year, US West Communications Inc. (www.uswest.com) signed agreements with six major vendors -- including Microsoft, Oracle Corp., and Novell Inc. -- to offer applications on an as-needed basis via its national data network. The software vendors "will be playing both sides of the fence, providing their own hosting services, while selling solutions to ASPs," Doculabs' Patel says.

Eventually mission-critical applications may become available on a rental basis, says Steve Brand, general manager of the hosted applications group at Lotus. "Today, it takes a week for IS to start setting up an application," Brand says. "Tomorrow, we'll be able to go to a catalog, pick an application, and have it immediately created."

Thin Clients Grow

The desktop PC of the future will shrink. A prototype unveiled earlier this year by Intel is a compact, pyramid-shaped device that boots up in 20 seconds -- an "instant-on" capability. The box stands 7.5 inches high and about 8 inches wide on each side. Inside is a 500-MHz Pentium III processor, 128 MB of SDRAM, a DVD-ROM drive, and a 6 GB hard disk. In place of internal ISA slots and parallel and serial ports are four USB ports and two IEEE 1394 fast serial ports for connecting to video capture and other audio-visual devices. Flat panel displays will become standard.

Along with thinning hardware, the client environment is likely to become more slender. Worldwide shipments of thin clients -- or intelligent terminals -- will jump from 369,000 in 1998 to 6 million in 2003, according to IDC. "We believe the future of thin-client computing is positive," Zona's Blatnik agrees. The trend will increase, "driven by more standardized products, increased customer awareness, better management and administration tools, and proliferation of Windows NT-based servers and server-centric computing solutions -- particularly the concept of applications or information that can be purchased on demand."

Blatnik predicts the simultaneous rise of a new class of thin PCs. "Tightly managed Intel architecture-based PCs will be pervasive, rivaling traditional thin-clients in providing low administrative costs."

The E-marketplace

Business-to-business e-commerce is a fast-growing juggernaut. Forrester projects that online trade will grow from $43 billion in 1998 to $1.3 trillion by 2003 -- an annual growth rate of 99 percent.
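Forrester's 99 percent figure follows directly from compounding the two endpoints; a quick check:

```python
# CAGR implied by growth from $43 billion (1998) to $1.3 trillion (2003):
# (end / start) ** (1 / years) - 1
start, end, years = 43e9, 1.3e12, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.0%}")  # about 98%, consistent with the cited ~99%
```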

This market will be transformed by the emergence of electronic marketplaces that facilitate online transactions. Electronic marketplaces are entering supply chains in vertical industries and horizontal business functions, introducing new efficiencies and new ways of selling and purchasing products and services. GartnerGroup's Dataquest estimates e-market makers generated approximately $12 billion in 1998, and this excludes e-marketplaces focused on financial products -- for example, mortgages and bond securities -- in which volumes exceeded $100 billion.

Online marketplaces are providing access to products from a variety of competing companies, along with objective information. For example, PlasticsNet.com is a marketplace for the plastics industry, BizTravel.com is a source for the travel industry, and iShip.com provides information and access to the shipping industry. The rise of these collaborative online communities will fuel the growth of online procurement and selling systems as companies try to get on board with their particular industry site. While many of these communities are financially supported by advertising, revenues are starting to come from per-transaction fees on online purchases of business partners' products or services.

There is also a growing convergence between buyer-managed catalogs, supplier-managed catalogs, and electronic marketplace-managed catalogs. A hybrid approach to cataloging will assign the most frequently purchased goods -- such as paper stock and paper clips -- to buyer-managed catalogs.

"E-market makers will revolutionize trading relationships and business-to-business e-commerce, diverting a significant percentage of transactions from extranets, EDI, and Web storefronts while creating opportunity for a new breed of e-commerce transactions," says Leah Knight, senior industry analyst at Dataquest. "E-market makers will change the way organizations purchase and sell strategic and nonstrategic products, the way they purchase and deploy software, and the way they evaluate professional services companies, financial institutions, and ISPs."

Informal Information Sharing

Many established companies are finding their market share being sapped by smaller and nimbler competitors with names starting with e- or i- and ending with dot-com. IDC estimates that by 2003, top-tier companies will lose up to $31.5 billion because of inefficiencies resulting from intellectual rework, substandard performance, and inability to find knowledge resources.

To increase innovation, companies are using knowledge management to institutionalize their expertise and customer knowledge. "Knowledge management views work not only as producing things and exchanging information, but also as fostering creativity and innovation," says David Polimeros, manager of the consulting group for the distribution industry at IBM. "Instead of looking at a real activity-oriented work environment, knowledge management puts some slack time in that environment, so people have a chance to socialize and talk about what they're doing with their associates, to foster creativity. It's a new way of looking at time."

Monsanto sought to establish an electronic version of serendipitous socializing between employees. The company is sponsoring electronic discussion forums to bring its scientists together to share ideas and brainstorm. "The lifecycle of new products is growing increasingly short, so we need to increase our ability to innovate more quickly and to distribute those products more quickly into the marketplace," Monsanto's Ogens says. "Our value is created through the intellectual capital of the people that work in our company. So we have to emphasize an ability for collaboration, knowledge management, and the ability to really maximize the talent and the understanding of the people that work in the company."

Because of a lack of tools and processes to actively capture, manage, and connect organizational expertise, an estimated 3.2 percent of corporate knowledge is incorrect or becomes obsolete every year. Another 4.5 percent of knowledge is unavailable due to employee turnover, information mismanagement, or knowledge hoarding.

The Web provides an abundant amount of information to organizations, but "pure access to information is not enough," says Len Schulwitz, director of marketing for the information management initiative at Intel Corp. "You need to get the right information, so you can assimilate it in, and turn it into something actionable." The Web will also hurt companies because the amount of information on the Internet will increase about 40 times over the next few years. "The information onslaught of what you're dealing with now is 5 percent of what you'll see within four years. Because of the delivery mechanism of the Internet, we need tools that will help us sort through that onslaught," Schulwitz says.

Technology that can facilitate knowledge management is expected to appear on the market within the next two years. It will include data warehouses and data marts, collaborative communications technologies such as groupware, and business intelligence tools, IBM's Polimeros says. Collaborative groupware, such as Lotus Domino/Notes or Microsoft Exchange/Outlook, facilitates the electronic flow of ideas. At the desktop level, Intel is developing a solution -- to be incorporated into partners' packages -- that lets end users view information from various sources in a 3-D format.

"In the past, when you were using a computer, you were focused in on applications," Schulwitz says. "Now, it's moved to a document focus. Soon, it will be a context focus, based on project and task-based work. You want information to be readily available, and as opposed to having to go out and having to hunt and peck for it."