In-Depth

Testing the "Right Stuff" with InCert Examiner

Instead of trying to figure out just how much testing is enough, the quality assurance department at the Securities Industry Automation Corporation (SIAC) identifies and tests only the "right stuff" – the exact operational functions of the COBOL programs running in its S/390 environment. SIAC is the central resource providing key systems support for the New York and American Stock Exchanges, the National Securities Clearing Corporation and the securities industry nationwide. At the heart of the financial industry, SIAC fully understands the necessity of accurate, quality testing, as well as the need for the efficient turnaround of new production enhancements demanded by rapidly changing business requirements.

"There’s little time to try to devise different types of testing scenarios based on guesswork," explains Senior Quality Assurance Manager Saul Kaminsky. "We need to build our test plans based on facts, and those facts are the functions that are performed by the program."

Using InCert Examiner from InCert Software Corporation (Cambridge, Mass.), SIAC can monitor the actual lines of code exercised by the COBOL programs in its S/390 environment, gain the understanding needed to test the "right stuff" and eliminate the uncertainty of guesswork.

Heavy Responsibilities

The quality assurance department at SIAC is responsible for validating the accuracy of all production deliverables. The group puts all programs and applications through the paces of functional testing, regression testing, performance testing and acceptance testing. On top of all these activities, the group also performs crucial requirements-based testing for applications, as well as up to several days' worth of production parallel testing for results comparison and volume validation.

However, with a large portfolio of legacy COBOL programs, the task of identifying exactly which functions needed to be tested grew increasingly complex – especially when it came time to match requirements with code. A major driver of this complexity was that many of the legacy programs were almost 30 years old and, as at most companies with a large legacy portfolio, virtually all of the original programming staff had either retired or moved on to other positions within the company. The result was a severely depleted knowledge base.

Further complicating matters were the extensive modifications performed throughout the years. Typically, these modifications were made to satisfy tactical business requirements and, instead of resulting in major redesign or rewrite, adhered to the old adage of "if it ain’t broke, don’t fix it" – efforts remained concentrated on changes to specific sections of code within the existing program.

On top of these challenges, most of the SIAC programs throughout the years have followed the traditional upgrade path from OS COBOL to VS COBOL-II to COBOL for S/390.

"Over time, functionality has been continually modified, enhanced and expanded," says Kaminsky. "There no longer exists any person who can accurately describe the true role of each section of each program. Multiply that by the number of programs in our portfolio, and you can find yourself in an investigative nightmare."

Regardless of the challenges of discovering true program functionality, the requirements for effective testing and high application quality remained stronger than ever: Just think of the billions of dollars worth of trading that occurs each day on the floors of the various exchanges supported by SIAC. Plus, the business-critical systems driven by the legacy code consistently demanded the efficient implementation of production enhancements.


Pinpointing Functionality

Without an automated method to determine exact program functionality, SIAC was limited to largely manual methods for analyzing legacy COBOL programs. For some companies, this means the "highlighter, paper clip and antacid" technique of reviewing stacks of printed source code. In other cases, code is manually reviewed and analyzed through TSO/ISPF. Unfortunately, neither of these methods provides interaction with the program in question, nor insight into exactly how a program behaves during operation.

In other situations, analysts employ powerful COBOL debugging tools as an aid to analysis. While such tools help provide an overview of program flow and functionality, overhead constraints typically limit their use to the test environment, with test data copied from production, albeit on a much smaller scale.

"Garnering the statistics from the production environment over a period of time helps build the baseline needed to identify explicit program functionality," explains Kaminsky. "Even with our other testing tools in place, we needed a way to improve our test cases and further enhance the value of those tools."

Understanding Code Usage

The implementation of InCert Examiner has simplified testing and reduced complexity for the SIAC quality assurance department. By generating the code coverage information needed to understand the exact functions within the program, test plans have grown shorter, yet more comprehensive and accurate. Also, for the COBOL programs in the S/390 environment, the company has effectively eliminated one of the largest challenges associated with testing – the discovery and creation of accurate test data.

InCert Examiner contains a binary instrumentation utility that embeds InCert Agents into the load modules. This binary instrumentation introduces a single additional instruction per complete block of code with no associated I/O. Instrumentation occurs after an application has been compiled and linked and requires no source modifications, eliminating any potential source management issues of maintaining instrumented versus non-instrumented code.
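The basic-block approach described above can be sketched conceptually. The following Python fragment is a minimal illustration of the idea only – it models one counter increment per block of straight-line code, recorded by an agent, and is not InCert's actual binary instrumentation or agent format. The block names and the `process_record` routine are hypothetical stand-ins for instrumented COBOL paragraphs.

```python
# Illustrative sketch of basic-block coverage instrumentation.
# Not InCert's implementation; block names are hypothetical.

class CoverageAgent:
    """Records which basic blocks execute: one counter bump per block."""

    def __init__(self, block_ids):
        self.hits = {b: 0 for b in block_ids}

    def mark(self, block_id):
        # The single "extra instruction" per block: increment a counter.
        self.hits[block_id] += 1

    def unexecuted(self):
        # Blocks never exercised during the monitored runs.
        return sorted(b for b, n in self.hits.items() if n == 0)


def process_record(record, agent):
    # A stand-in for an instrumented program: each straight-line
    # section marks its block before doing its work.
    agent.mark("ENTRY")
    if record.get("type") == "TRADE":
        agent.mark("TRADE-PATH")
        result = "settle"
    else:
        agent.mark("OTHER-PATH")
        result = "skip"
    agent.mark("EXIT")
    return result


agent = CoverageAgent(["ENTRY", "TRADE-PATH", "OTHER-PATH", "EXIT"])
process_record({"type": "TRADE"}, agent)
print(agent.unexecuted())  # -> ['OTHER-PATH']
```

Because the counter bump is the only added work and involves no I/O, the per-block cost stays small – consistent with the low overhead figures discussed below.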

According to Kaminsky, the instrumentation at the load module level presented the best-fit solution for SIAC – especially because of overhead considerations and the desire of the organization to monitor the production environment.

"We looked at other solutions that performed instrumentation at compile time and found that some increased overhead anywhere from 15 percent to 40 percent," he says. "That’s probably okay for testing, but it’s obviously unacceptable for monitoring production. Examiner overhead is approximately one percent in tested cases."

In fact, benchmark results produced by SIAC revealed that execution overhead in some cases actually dropped after the load module had been instrumented.

"The results clearly indicate that aside from no additional, noticeable overhead being created by the instrumentation of the code, some of the jobs executed with less CPU time than the non-Examiner jobs," says Kaminsky. "This indicates that the primary difference in times is likely ‘noise’ on the CPU during execution time."

Reducing Test Requirements

Once a load module is instrumented, InCert Examiner monitors the lines of code exercised during execution and dynamically collects execution statistics. This enables the quality assurance department to immediately pinpoint exactly which lines of code are executed in production and which lines of code should be tested.

By understanding which lines of code are required for testing, a quick review of the program can reveal exactly what type of test data is required. In one instance, SIAC was able to reduce the size of a test bed from 1.4 million records to a mere 4,000 because Examiner helped identify the exact records required to perform validation.
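A reduction like the one described can be sketched as a simple greedy pass over the coverage data: keep a record only if it exercises some code path the kept set does not already cover. This Python sketch is an illustration of the technique, not SIAC's actual process; the record names and path sets are invented for the example.

```python
# Hypothetical sketch of coverage-guided test-data reduction:
# keep only records that add coverage beyond the records already kept.

def reduce_test_bed(records, paths_for):
    """records: candidate test records, in order.
    paths_for(r): the set of code paths record r exercises."""
    kept, covered = [], set()
    for r in records:
        new_paths = paths_for(r) - covered
        if new_paths:          # record exercises something new: keep it
            kept.append(r)
            covered |= new_paths
    return kept


# Toy example: rec2 covers nothing beyond rec1, so it is dropped.
paths = {"rec1": {"A", "B"}, "rec2": {"A"}, "rec3": {"C"}}
kept = reduce_test_bed(["rec1", "rec2", "rec3"], lambda r: paths[r])
print(kept)  # -> ['rec1', 'rec3']
```

Applied at scale, the same idea explains how a 1.4-million-record test bed can collapse to a few thousand records once the coverage data identifies which records actually drive distinct code paths.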

Equally important, Kaminsky contends, is the ability to quickly identify sections of a program that have not been exercised. While it’s always possible that the non-executed section of a program could contain dead code, it’s equally possible that it could be a highly critical section of code that didn’t execute because specific conditions were not met during the test. "Several times we’ve run into situations where a section of code was not executed because the data was not available," says Kaminsky. "Examiner helped us avoid a painstaking manual analysis of any number of different source modules."
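Flagging those unexercised sections amounts to comparing two coverage profiles: blocks hit in production but missed by the test run. The sketch below illustrates that comparison in Python; the block names and hit counts are hypothetical, and this is a conceptual model rather than Examiner's reporting format.

```python
# Illustrative: flag code sections exercised in production
# but never reached by the test run (e.g., because the
# triggering data was absent from the test bed).

def untested_in_production(prod_hits, test_hits):
    """Both arguments: dict mapping block id -> execution count."""
    return sorted(b for b, n in prod_hits.items()
                  if n > 0 and test_hits.get(b, 0) == 0)


# Hypothetical profiles: an end-of-month path runs in production
# but the test data never triggered it.
prod = {"ENTRY": 100, "TRADE-PATH": 90, "EOM-PATH": 3}
test = {"ENTRY": 5, "TRADE-PATH": 5}
print(untested_in_production(prod, test))  # -> ['EOM-PATH']
```

A gap flagged this way tells the tester exactly which conditions the test data failed to provoke, replacing the manual source analysis Kaminsky describes.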

While initially implemented on a single DB2/CICS/batch application, SIAC is in the process of expanding usage of InCert Examiner to encompass its entire production, test and development COBOL portfolio. Furthermore, the steps needed to instrument load modules will be embedded in the company’s existing change management software so that the process occurs automatically as programs move through the system development life cycle in its S/390 environment.

"The results of Examiner are commendable," concludes Kaminsky. "It enables us to maintain the pace needed to validate production changes while increasing the quality of our legacy COBOL portfolio with reduced testing complexity."

About the Author: Philip E. Courtney is a marketing consultant and technical journalist. He can be reached at (602) 684-2854, or online at www.philcourtney.com.