The More Things Change ...
I was recently invited to speak at a regional DECUS conference organized in part by ENT
columnist Greg Scott. The subject I was asked to address is also the title of this editorial.
As I started to think about the parallels between the computing model of today and the computing model of yesterday, I realized there are some really odd things happening.
But first a relevant -- and true -- story. It is about a young friend of mine who happens to be quite knowledgeable about Windows NT. Being in his early 20s, and having ridden the Windows NT wave from its very beginning, he finds it something of an oddity that organizations really do still use old computers, especially machines that were discontinued back when he was an infant.
Recently he went back to college to start working on an advanced degree. In need of some extra spending money, he looked into a potential part-time job on campus. The job was to swing by the computer center during the night to swap out backup tapes on the university’s PDP-11. The old Digital system still controls the building locks and all the environmental settings for every building on campus.
To say he was stunned to find such an "outdated" machine in use would be an understatement. I could hear it in his voice when he told me about his surprise. I think he simply couldn’t understand how that old machine could possibly be useful, what with its "antique" operating system and a CPU with less horsepower than an outdated PC. A great application to convert to NT, isn’t it? Or is it?
Don’t suggest that to the people who attended this regional DECUS event. The irony is that the computing model that old PDP-11 still uses -- the same model used by VMS systems, AS/400s, mainframes, and any Unix system -- is more relevant than many people care to admit.
The computer industry and the user community have put themselves through a great deal of sweat and strain chasing the trends that have come and gone in the past 15 years. I like to sketch the swings as a sinusoidal wave that bounces from one extreme to the other.
Back around 1980 or so, about the only computing model used was the host-centric model. One big computer, known as a "server" in today’s terminology, was shared by all of the users who were connected using terminals, better known as "thin clients" in modern terminology.
Then PCs arrived. Although many IT managers initially pretended these newfangled machines had not invaded their turf, the PCs had come to stay, and over time they were integrated into the corporate enterprise. Initially, this integration was pretty much limited to one of two modes. One choice was fat clients running applications that drove against data resources on the "big computer." The alternative was to run an emulation package, in effect turning the PCs back into terminals.
One thing you don’t see much anymore is that famous Gartner Group chart depicting the various models of client/server computing. You know, the chart with the diagonal line through it that broke the presentation, application and database elements across the client and the server.
By the early to mid-1990s, many companies had in fact figured out how to create a distributed application that could actually use some of the processing power that resided on the PC side of that picture.
Then the Web revolution occurred.
Today we have software vendors scrambling to sell users on "brand-new" concepts, yet most of these concepts are old technology models being reinvented in the NT space: things such as intelligent input/output cards (the I2O initiative), clustering (Cluster Server), and even multiuser computing from a single platform, better known today in NT land as Windows Terminal Server.
Perhaps the most interesting concept, and one that should get the award for the best modernization of old technology, is the concept of an "application server."
With an application server and a bunch of thin clients (take your pick: PCs running a browser, Windows-based terminals, or whatever), you have a host-based computing model again. IBM Corp. and Oracle Corp. have vocally proclaimed application servers as the next great solution, and word is that Microsoft Corp. plans to formally announce its own answer to the application server model.
While this new application server model won’t do anything to modernize that PDP-11 that is managing campus operations at my friend’s university, it does bring the computing model back full circle. From that perspective, it would seem that the more things change, the more they really do stay the same.