Managing Risks: Testing and Quality Assurance for Year 2000 Projects

As IS organizations progress through their Year 2000 efforts, they quickly encounter quality assurance and testing issues. Leading analysts estimate that verifying the correctness of Year 2000 fixes will account for 45 to 55 percent of the overall project effort. Once they reach the testing portion of their projects, many IS organizations discover they are poorly prepared to conduct the necessary tests, often lacking the expertise and infrastructure needed for large-scale testing efforts.

IS organizations must apply project management principles to Year 2000 quality assurance and testing efforts. Such an approach lowers overall risk by building in quality and establishing processes for verifying correctness of the Year 2000 deliverables.

This approach is described in three parts: defining Year 2000 compliance; assuring quality during project operation; and validating compliance through testing.

Defining Compliance

Defining compliance implements two key principles of project management: define the job in detail and agree on acceptance criteria. To achieve quality, Year 2000 project team members must know their final objectives and the criteria that will be used to judge whether those objectives have been met. Century-date compliance is not a single set of criteria that can be applied universally across an IS organization's software portfolio; different applications require different levels of acceptable compliance.

This need for flexibility in compliance criteria must be balanced against the need for standardization to avoid chaos. Application teams must use consistent methods for the files and programs they share with other areas. This balance can be achieved through the use of compliance definitions: standard, written specifications of the elements selected to make an application compliant.
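To make this concrete, a compliance definition can be held as structured data rather than free-form text. The following Python sketch is purely illustrative; the field names, strategies and application names are hypothetical, not drawn from any standard:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative compliance definition record. All field names, strategies
# and application names here are hypothetical.
@dataclass
class ComplianceDefinition:
    application: str
    date_strategy: str                   # e.g., "field expansion" or "windowing"
    pivot_year: Optional[int] = None     # windowing pivot, if windowing is used
    shared_files: List[str] = field(default_factory=list)    # interfaces that must stay consistent
    required_tests: List[str] = field(default_factory=list)  # tests needed to certify compliance

payroll = ComplianceDefinition(
    application="PAYROLL",
    date_strategy="windowing",
    pivot_year=1950,                     # two-digit years below 50 are read as 20xx
    shared_files=["EMP-MASTER", "GL-FEED"],
    required_tests=["regression", "forward-date", "century-transition"],
)
```

Captured this way, the same definition can drive the implementation team's conversion strategy, the QA checklist and the test plan.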

The Year 2000 implementation team uses the compliance definition as a guideline to determine their conversion strategies. Quality assurance efforts use it as a checklist to ensure all necessary activities have been completed. Year 2000 testing personnel use the compliance definition to guide test case development and as a specification for the types of tests and verification activities needed to certify final compliance.

Assuring Quality

IS organizations cannot rely solely on testing procedures to uncover errors in their Year 2000 implementation efforts. Few IS organizations have sufficiently robust testing infrastructures to perform the level of testing required for complete safety. Project overruns combined with immovable Year 2000 deadlines will cut into the time available for full-scale testing. Given these limitations, it is imperative that IS organizations devote their efforts to preventing errors early in the development cycle rather than catching and correcting them later.

By applying quality assurance initiatives throughout the Year 2000 project, companies can conserve resources and focus testing efforts on areas of greatest concern, such as strategic applications. Additionally, QA processes provide documented evidence of a company's error-prevention efforts and compliance status. As the Year 2000 deadline draws closer, internal and external auditors will become increasingly concerned about the status of corporate Year 2000 compliance activities. Well-documented and enforced quality assurance processes can give auditors the information they need to assess project status and remaining corporate risk, and to certify the adequacy of both the remediation processes and the processes used to validate compliance.

With a few exceptions, the basic quality assurance practices used for a Year 2000 project are the same as those applied to any large maintenance project. If an IS organization is not currently using formal quality assurance practices, it should use the Year 2000 effort as an opportunity to implement and gain experience with these practices. The highly repetitive nature of Year 2000 tasks makes them well suited to the benefits achievable through total quality improvement techniques. Following are some of the more valuable techniques for implementing QA methods in Year 2000 projects.

Initial compliance testing. No piece of software should be assumed compliant unless it has been tested. Initial compliance testing should be performed as a screening method early in the Year 2000 project on any applications thought to be century-date compliant. This ensures there is enough time to correct any applications that fail the test.
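As a simple illustration, an initial screen can exercise an application's date handling at two known trouble spots: the century boundary and the year 2000 leap day. The Python sketch below is hypothetical; `days_between` stands in for whatever date routine the application actually exposes:

```python
from datetime import date

# Minimal sketch of an initial compliance screen. `days_between` is a
# hypothetical stand-in for the application's own date arithmetic.
def days_between(a: date, b: date) -> int:
    return (b - a).days

def passes_initial_screen() -> bool:
    checks = [
        # century transition: Dec 31, 1999 to Jan 1, 2000 is one day
        (date(1999, 12, 31), date(2000, 1, 1), 1),
        # 2000 is a leap year (divisible by 400), so Feb 29 exists
        (date(2000, 2, 28), date(2000, 3, 1), 2),
    ]
    return all(days_between(a, b) == expected for a, b, expected in checks)

assert passes_initial_screen()
```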

Project checklists. The inventories collected during the Planning phase serve an important quality assurance function throughout the project. The information should be stored in a database or repository along with status data for use as a QA checklist during each project phase. This inventory is used to identify interactions and dependencies between sub-projects, and it provides a checklist for ensuring all necessary components have been analyzed, corrected and tested.
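A minimal sketch of such a checklist query follows; the component names and phase fields are hypothetical, and a real repository would hold far more detail (owners, interfaces, schedules):

```python
# Illustrative inventory used as a QA checklist; component names and
# phase fields are hypothetical.
inventory = [
    {"component": "PAYROLL01", "analyzed": True, "corrected": True,  "tested": False},
    {"component": "GLPOST02",  "analyzed": True, "corrected": False, "tested": False},
]

def outstanding(phase: str) -> list:
    """Return the components that have not completed the given phase."""
    return [item["component"] for item in inventory if not item[phase]]

print(outstanding("corrected"))  # ['GLPOST02']
print(outstanding("tested"))     # ['PAYROLL01', 'GLPOST02']
```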

Technical reviews. Technical reviews, such as peer reviews and technical walk-throughs, are valuable techniques for early error capture. Peer reviews are informal sessions where programmers exchange code for review by their peers. These reviews should occur regularly while code changes are under way. Technical walk-throughs are scheduled sessions where change strategies, source code, change logs and test plans are formally reviewed. These sessions should be scheduled as milestones are achieved.

Project reviews. Project reviews are essentially technical walk-throughs for project managers. These reviews concentrate on project management issues such as schedules, resource constraints and interdependencies between projects. These reviews are performed by a combination of senior managers and peer managers. The goal is to catch and correct project issues before they negatively affect project schedules.

Comparison of phase deliverables. Deliverables produced at each phase of the Year 2000 project have value for quality assurance efforts. For example, the assessment reports of impacted source code produced during the Strategy Development phase should be compared against logs of actual source code changes created during the Implementation phase. Although differences are expected between the two reports, any discrepancies should have valid explanations. Consistent, large discrepancies indicate the need for process changes in analysis and/or implementation. These same logs should be used for a quality assurance check of the test cases produced for final validation.
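In practice this comparison reduces to set arithmetic over component lists. A hypothetical sketch, assuming each deliverable can be reduced to a set of component names:

```python
# Sketch of a phase-deliverable comparison; the component names and the
# assumption that both reports reduce to name sets are illustrative.
assessed = {"PAYROLL01", "GLPOST02", "APCHECK03"}  # flagged in Strategy Development
changed  = {"PAYROLL01", "GLPOST02", "ARBILL04"}   # logged during Implementation

flagged_not_changed = assessed - changed  # possible missed fixes
changed_not_flagged = changed - assessed  # possible analysis gaps

# Every discrepancy should have a documented explanation; consistent,
# large discrepancies signal a process problem upstream.
for name in sorted(flagged_not_changed | changed_not_flagged):
    print(f"explain discrepancy: {name}")
```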

Validating Compliance

The only method to help ensure full century-date compliance has been reached for a given application is extensive testing. This testing must ensure that the application's operational environment (e.g., its hardware platform and operating system software) performs correctly; that the application functions correctly in the next century and through the transition between centuries; and that the application's existing functionality has not been compromised by the compliance effort.

Building a Testing Infrastructure

The availability of a formal test environment greatly facilitates the verification of century-date compliance. The existing test environments in most IS organizations will have to be extended to handle Year 2000 projects. Fortunately, these extensions are reusable for future testing efforts and will lower the cost and increase the quality of all application testing. A proper test infrastructure is made up of software tools, test data, test scripts and a testing methodology. It handles four major functions: managing the test environment, creating test data, executing tests and validating test results.

Function 1: Managing the test environment. This function provides the foundation for managing and coordinating large-scale testing activities. It includes the tools and processes for test planning, test project management, problem tracking, script tracking, and test methodology management. If this function is not already present in the IS organization, it must be set up before the subsequent testing functions. This setup includes tool acquisition, hardware resource acquisition, skills development, and test library creation.
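Problem tracking is one piece of this function that is easy to picture in code. The record layout below is purely illustrative; real test-management tools carry far richer data:

```python
# Illustrative problem-tracking record for the test-environment function;
# the fields and status values are hypothetical.
problem_log = []

def log_problem(test_id: str, component: str, description: str) -> dict:
    entry = {
        "id": len(problem_log) + 1,
        "test": test_id,
        "component": component,
        "description": description,
        "status": "open",  # open -> fixed -> retested -> closed
    }
    problem_log.append(entry)
    return entry

log_problem("FWD-014", "PAYROLL01", "net pay wrong for run dates after 2000-02-29")
```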

Function 2: Creating test data. Unless complete regression test libraries are available, test data must be created to support Year 2000 testing. This function encompasses all of the activities and tools required to create, manipulate and manage test data. Although Year 2000 projects have complex test data requirements, careful planning and reuse during test data creation can reduce the effort required to obtain adequate test coverage.

There are three major methods for creating test data: extracting production data, using a test data generator and manually creating test data. Extracting production data is typically the easiest approach, but often results in low levels of test coverage. Test data generators and capture/replay tools create test data based on input parameters or by capturing keystrokes. Initial setup can be laborious, but future test data is easily generated when needed. Manual data creation is too slow and laborious for use in building complete regression libraries, but is useful for building specialized test cases. Once created, test data can be reused for multiple types of tests. For example, regression test data can be forward dated for use in future date testing. Each year field in the regression test data can be incremented by 28 to move it into the future (e.g., 1998 becomes 2026). Incrementing by exactly 28 guarantees that all dates fall on the same day of the week, because the calendar repeats on a 28-year cycle between 1901 and 2099.
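A sketch of that forward-dating rule in Python, using an arbitrary sample date:

```python
from datetime import date

# Sketch of the 28-year forward-dating rule. Between 1901 and 2099 the
# calendar repeats on a 28-year cycle, so shifted dates keep their day
# of the week; the sample date below is arbitrary.
def forward_date(d: date, cycles: int = 1) -> date:
    return d.replace(year=d.year + 28 * cycles)

original = date(1998, 7, 15)
shifted = forward_date(original)
assert shifted == date(2026, 7, 15)
assert original.weekday() == shifted.weekday()  # same day of the week
```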

Function 3: Executing tests. Year 2000 testing requires creating and executing test scripts to handle all application jobs and on-line functions. These tests must be executed in a variety of date environments, including the current date for regression testing, forward dates to ensure future compliance and century-transition dates to exercise boundary conditions. Depending on the needs of the IS organization, unit testing can be performed on workstations using environment simulators. Test scripts and specialized procedures for automating test runs greatly reduce the effort for those tests that will be executed multiple times. PCs and end user software can be validated by setting up special Year 2000 workstation environments. End users install their applications in these forward-dated environments to ensure their software operates correctly.
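The same script can be driven through each date environment by parameterizing the clock. In this hypothetical sketch, `run_nightly_batch` stands in for an application job, and the simulated clock is passed in rather than read from the system:

```python
from datetime import datetime

# Hypothetical sketch of one test script driven through several date
# environments; `run_nightly_batch` stands in for an application job.
TEST_DATES = [
    datetime(1999, 6, 15),            # current-date regression run
    datetime(1999, 12, 31, 23, 59),   # century-transition boundary
    datetime(2000, 3, 1),             # forward date, past the leap day
]

def run_nightly_batch(now: datetime) -> None:
    print(f"batch run as of {now:%Y-%m-%d %H:%M}")

for simulated_now in TEST_DATES:
    run_nightly_batch(simulated_now)
```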

Function 4: Validating test results. This function includes the tools and techniques used to evaluate the correctness of results produced during test execution. Validation includes ensuring that a migrated application conforms to the organization's compliance standards, verifying that the correct components were modified, validating test output, and gaining user sign-off that testing was completed successfully. Validating the results of a Year 2000 testing effort is the most labor-intensive and error-prone activity of the entire century-date compliance effort. The work is highly repetitive and relies heavily on human effort to find important discrepancies in the test results, so automation should be applied wherever possible to reduce effort and increase accuracy. Compare utilities are the most common method of automating validation activities, but simple comparisons fall short for Year 2000 testing, where file formats may vary and uniform date differences are expected while all other results must be identical. Intelligent compare tools can ignore some of these differences but cannot handle every situation. Validation efforts can be reduced by avoiding all non-Year 2000-related changes during remediation, which keeps test results directly comparable by automated means.
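One way such an intelligent compare can work is to normalize the expected date shift out of the output before comparing. The sketch below assumes report lines carry YYYY-MM-DD dates and a uniform 28-year shift, matching the forward-dating technique described earlier; the record format is hypothetical:

```python
import re

# Sketch of an "intelligent compare" that tolerates a uniform date shift.
# The record format and the 28-year shift are assumptions for illustration.
DATE = re.compile(r"\d{4}-\d{2}-\d{2}")

def normalize(line: str, year_shift: int) -> str:
    """Shift each date's year back so only genuine differences remain."""
    def shift(m: re.Match) -> str:
        y, mo, d = m.group().split("-")
        return f"{int(y) - year_shift:04d}-{mo}-{d}"
    return DATE.sub(shift, line)

baseline = "INV 1001 1998-07-15 TOTAL 250.00"
forward  = "INV 1001 2026-07-15 TOTAL 250.00"
assert normalize(forward, 28) == baseline  # date shift ignored; amounts must match
```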

The Bottom Line

IS testing has always been difficult. In a Year 2000 environment, it is uniquely challenging and complex. The volume of the effort is tremendous, the technical environment is diverse, time is limited, skill requirements are broad, and resources are difficult to secure. However, a successful Year 2000 strategy is possible, provided it incorporates detailed planning, testing and quality assurance processes, and automated tools.

ABOUT THE AUTHOR:

Chuck Aquilina is Year 2000 Director for Keane, Inc., a $1 billion IT consulting firm and leading provider of Year 2000 compliance solutions.
