Realities of Post-2000 Application Development
While the Year 2000 transition represents a huge symbolic change for many industries and businesses, for application development it represents a milestone in a rapidly changing world. The most-talked-about millennium issue--the millennium bug--is rooted deeply in traditional software development disciplines, processes and tools.
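To make the premise concrete, here is a minimal sketch of the class of defect behind the millennium bug: legacy records that stored only the last two digits of the year. The method and field names are hypothetical, not drawn from any particular system.

```java
// Illustrative sketch of the two-digit-year defect (hypothetical names).
public class MillenniumBug {
    // Many legacy records stored only the last two digits of the year,
    // so date arithmetic silently assumes both dates share a century.
    static int yearsBetween(int fromYY, int toYY) {
        return toYY - fromYY; // breaks when the century rolls over
    }

    public static void main(String[] args) {
        // A record opened in 1985, evaluated in 1999: correct result.
        System.out.println(yearsBetween(85, 99)); // 14
        // The same record evaluated in 2001 (stored as "01"): nonsense.
        System.out.println(yearsBetween(85, 1));  // -84
    }
}
```

Fixing millions of lines of code like this, scattered across programs, files and databases, is what absorbed those billions of dollars.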
Software engineering was an oxymoron pre-2000 but will be very much a reality post-2000. Precisely because software development standards were not rigorous enough in the past, we've spent the last several years diverting billions of dollars of investment into what is essentially bug fixing. So now we reflect, but find the mood of the market has changed considerably.
Application development has progressed through 3GL, 4GL and CASE (Computer-Aided Software Engineering)...it's all well-storied on the AS/400. RPG, CSG, SYNON, ENVY, AD/CYCLE...who could forget?
With CASE, productivity was supposed to move up yet another notch by having business analysts describe business processes. They would use sophisticated process modeling tools and repositories to store data entities and their relationships to business processes. These would then integrate with application generators. CASE also implied something else--quality and reuse. The former is a much-desired attribute of software that had often been missing, and the latter is a much-desired attribute of the development process, targeted toward repeatability, consistency, quality and productivity.
This gave OO (Object Orientation) its life. And now, driven by the rush to e-business, Java and an entire new breed of OO languages and tools are "in." It seems to me that the constants in all this are: (1) the "old" methods and tools don't go away very quickly, if at all; and (2) we're always searching for the "Holy Grail," the one single best approach and tool.
If we decompose application development into its two parts, we get: applications and development. (That was a stroke of genius, wasn't it?) Applications are what Information Technology is all about. Applications provide the business value for IT. They are the automation of business processes, the reduction in costs, the improvements in service or the strategic business initiatives that make us more competitive. And applications are moving to e-business. Who can deny this?
Development is process, and process, to be efficient and effective, requires methodology. Process or methodology gives us a structured and systematic way of achieving our objective (that of delivering business value using IT, otherwise known as applications). Methodology can tell us what will be done (activities or tasks), how it will be done (phases or methods), who does it (roles), the standards to be used, what will be produced (deliverables) and even how it will all be controlled (project management).
We still need to talk about the future. In the best case, the future is prediction and projection based on accumulated knowledge and wisdom; in the worst case, it is crystal-ball gazing, otherwise known as guessing. In either case, it represents opinion. So what does the future hold? Software engineering and software integration.
Old methods and tools will continue to hang on. COBOL and RPG3 are still alive and processing in many existing applications, cohabiting well with DBMS engines in ways that make them more efficient and productive. Post-2000, this old code will become more and more isolated as business transitions to e-business. New development with CASE is alive and well, residing in some form in most of today's ERP solutions for valid client/server development reasons. These solutions and their future extensions will demand continued use of these AD approaches. There is no business case for changing approaches.
There isn't a single best approach and tool and there isn't one best AD solution. Java is the right foundation for developing highly portable, device independent, browser-based applications. Java is best for building reusable components that can plug into common frameworks such as IBM's San Francisco. But the language and the tool itself are no longer the objective. They really don't matter. The future objective is reusability and component-based software development.
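The idea of component-based development can be sketched in a few lines of Java: a framework defines a small contract, and interchangeable components plug into it. The names here (TaxRule, FlatTax, Checkout) are purely illustrative assumptions, not APIs from IBM's San Francisco or any other framework.

```java
// Hypothetical sketch of component-based reuse (illustrative names only).
interface TaxRule {
    double taxOn(double amount);
}

// One interchangeable component implementing the contract.
class FlatTax implements TaxRule {
    private final double rate;
    FlatTax(double rate) { this.rate = rate; }
    public double taxOn(double amount) { return amount * rate; }
}

// The "framework" depends only on the contract, never the implementation,
// so components can be swapped or reused across applications.
class Checkout {
    private final TaxRule rule;
    Checkout(TaxRule rule) { this.rule = rule; }
    double total(double amount) { return amount + rule.taxOn(amount); }
}

public class ComponentDemo {
    public static void main(String[] args) {
        Checkout c = new Checkout(new FlatTax(0.10));
        System.out.println(c.total(100.0)); // ~110.0
    }
}
```

The point of the sketch is that the value lies in the reusable contract and its components, not in the particular language or tool that expresses them.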
Business Intelligence is driving us to use OLAP analysis and data mining tools to develop sophisticated intelligence-gathering applications. E-business is driving us to use Java because future applications must be completely detached from their execution hardware and operating system. What attributes will the next major wave of applications have, and what will they consequently require of AD methods and tools? Most likely, more powerful Web-based technologies such as XML and more powerful middleware such as MQSeries will allow us to continue to take advantage of our legacy applications.
Finally, I'm convinced that the recent Y2K experience has taught us at least one very important lesson that has to do with the value of process and methodology. Y2K has only been a symptom of a more important IT trend: the increasingly high-risk profile of our projects. As IT proliferates, projects involving concurrent changes to thousands of programs, databases and hardware devices with high costs (and high risk of litigation) are becoming the norm, no longer the exception. Many systems are complex hybrids that involve multiple levels of hardware, software and network components and databases. Y2K is not the only "mass-update project." Consider the implementation of ERP solutions (the E means enterprisewide impacts!), e-business, the euro and the IT consequences of corporate mergers and acquisitions, which are happening at a frightening pace. All of these projects are high risk. All of these projects emphasize the need for more robust AD methodologies. The lesson of Y2K is that integration is no longer something done only by outside consultants. Integration is IT's core competence.