
Informational Memorandum
Subject: Testing for Year 2000 Readiness
Date of Memorandum: 05/20/1998
Expiration Date:
Office: OE
Signed By: Smith, Roland
FCA Contact Person: Glenn, Thomas
Contact Phone: 703-883-4412
List of Attachments: None



Farm Credit Administration 1501 Farm Credit Drive
McLean, Virginia 22102-5090
(703) 883-4000

INFORMATIONAL MEMORANDUM

May 20, 1998



To: Chairman, Board of Directors
Chief Executive Officer
All Farm Credit System Institutions

From: Roland E. Smith, Chief Examiner
Office of Examination

Subject: Testing for Year 2000 Readiness


During the past 2 years, several Informational Memorandums have been issued that relate to the Year 2000 problem, including a March 17, 1998, Informational Memorandum titled “Expectations for Testing of Mission-Critical Systems,” which established key milestone dates for testing mission-critical systems. This Informational Memorandum complements that earlier document and provides additional guidance on the testing phase.

Testing may be the most critical phase of the Year 2000 readiness process. The purpose of testing is quite simply to ensure that remediation efforts work effectively. Failure to conduct thorough testing may mask serious remediation problems. Not identifying or correcting those problems could threaten the safety and soundness of an institution. Ultimately, each institution is responsible for ensuring its readiness for the Year 2000.

TESTING FOR YEAR 2000 READINESS

In recent guidance issued by the Federal Financial Institutions Examination Council, testing mission-critical systems was estimated to consume 50 to 60 percent of the time, funding, and personnel needed to make a financial institution Year 2000 ready. Although the percentages may differ for institutions examined and regulated by the Farm Credit Administration (FCA), testing will still likely require a substantial commitment of resources. Institutions must effectively manage the Year 2000 testing process, regardless of how individual systems are developed and operated. In practice, the controls necessary to manage the testing process effectively will differ depending on the design of the institution's system, interfaces with third parties, and the type of testing used.

Farm Credit System (FCS or System) institutions should test mission-critical systems first, as the failure of mission-critical services and products will have a significant adverse impact on the institution's operations and financial condition. Each system and application should be evaluated and tested based on its importance to the institution's continuing operations and the costs and time required to implement alternative solutions. System institutions should also obtain sufficient information from their mission-critical service providers and software vendors to determine if they are able to test products and services for Year 2000 readiness. The failure of these service providers and software vendors to test their products and services in an adequate and timely manner could pose a risk to the safety and soundness of an FCS institution.
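As an illustration of the prioritization principle above, the following minimal sketch ranks hypothetical systems by their importance to continuing operations and by the burden of implementing an alternative solution. The systems, scores, and weighting are illustrative assumptions only, not FCA prescriptions.

```python
# Hypothetical ranking of systems for testing priority. Each entry is
# (system, importance to continuing operations 1-5,
#  cost/time to implement an alternative 1-5).
systems = [
    ("loan accounting", 5, 5),
    ("funds transfer interface", 5, 4),
    ("facilities scheduling", 2, 2),
]

# Systems that are most important and hardest to replace are tested first.
for name, importance, replacement_burden in sorted(
        systems, key=lambda s: s[1] + s[2], reverse=True):
    print(f"{name}: priority score {importance + replacement_burden}")
```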

FCA supports the employment of user groups for testing as a means to control costs and expedite the testing process. Therefore, FCS institutions with common needs and similar mission-critical systems may join forces to evaluate the performance and testing methodologies of service providers and software vendors. By working through user groups, institutions can gather and share information on testing methodologies and on the results of user group evaluations of service providers' and software vendors' tests.

The extent to which institutions rely on third parties to design, implement, and manage their systems will affect the extent of an FCS institution's involvement in testing. Institutions that outsource all of these functions will have less extensive involvement in testing than those System institutions that perform some or all of their own programming or processing in-house.

No single approach to testing for the Year 2000 problem is appropriate for all entities. Testing options range from testing within a System institution's own environment to "proxy" testing. The appropriate testing method to employ will depend on a variety of factors, including whether the testing is being conducted on software or services received from third parties, as well as the type of system or application to be tested.

Testing Methodologies

Each institution should determine the types of tests it will perform based on the complexity of its systems, the level of its Year 2000 risk exposure, and its reliance on third parties for computer-based products and services. Moreover, in addition to testing a particular product or service, institutions should conduct testing between systems and products that interface both internally and externally. The following are examples of testing processes that could be employed (a brief sketch illustrating two of them follows the list):

Baseline tests are performed before any changes are made to a computer program or application. The baseline test gives an FCS institution a point of comparison for the system's performance after changes are made to it.

Unit tests are performed on one application to confirm whether remediation efforts yield accurate results for that application. They do not test how well the application will perform with other applications.

Integrated tests are performed on multiple applications or systems simultaneously. Integrated tests confirm whether computer programs function properly as they interact with other programs.

Regression tests verify a remediated system against the original system to ensure that errors were not introduced during the remediation process. Regression testing should be applied to both the remediated portion and the unchanged portion of the system.

Future date tests simulate processing of renovated programs and applications for future critical dates to ensure that those dates will not cause program or system problems.

User acceptance tests are performed with users and validate whether remediation has been done correctly and whether applications still function as expected.

Point-to-point tests verify the ability of an institution to transmit data directly to another entity or system.

End-to-end tests verify the ability of an FCS institution originating a transaction to transmit test data to a receiving entity or system through an intermediary.
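To make the distinctions among these tests concrete, the following is a minimal sketch, in Python, of a unit test and a future date test applied to a hypothetical two-digit-year "windowing" routine of the kind many remediation efforts produced. The routine, its pivot value, and the test cases are illustrative assumptions, not drawn from any System institution's code.

```python
import unittest
from datetime import date

PIVOT = 50  # hypothetical windowing pivot: 00-49 -> 2000s, 50-99 -> 1900s

def expand_year(two_digit_year: int) -> int:
    """Expand a two-digit year into a four-digit year using a fixed pivot."""
    return (2000 if two_digit_year < PIVOT else 1900) + two_digit_year

class Year2000Tests(unittest.TestCase):
    def test_unit_windowing(self):
        # Unit test: confirm the remediated routine alone yields accurate results.
        self.assertEqual(expand_year(99), 1999)
        self.assertEqual(expand_year(0), 2000)

    def test_future_date_rollover(self):
        # Future date test: simulate processing across the century rollover.
        before = date(expand_year(99), 12, 31)
        after = date(expand_year(0), 1, 1)
        self.assertEqual((after - before).days, 1)  # rollover spans one day

if __name__ == "__main__":
    unittest.main()
```

A baseline run of the same cases against the unrenovated routine would provide the point of comparison described above.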

TESTING PLAN

As previously communicated by FCA, System institutions must have a testing plan completed by June 30, 1998. The testing plan should employ a testing strategy and set testing priorities based on the risk that a system's failure poses to operations. The objective of the Year 2000 testing strategy should be to minimize business risk due to operational failures.

The testing plan should also provide for testing of both internal and external systems. Internal systems may include software, operating systems, mainframe computers, personal computers, reader/sorters, and proof machines. They also may include environmental systems, such as heating and cooling systems, vaults, security systems, and elevators. External systems may include services from service providers and any interfaces with external entities.

Management and staff are expected to have the knowledge and skills necessary to understand and effectively manage their Year 2000 testing efforts. Management should identify special staffing and training needs for personnel involved in testing and should determine how it will allocate resources and, if necessary, hire and train employees to run and analyze tests. Examiners will evaluate testing efforts by reviewing a System institution's testing strategies and testing plans to ensure that the institution can meet the key milestones addressed in the previous FCA Informational Memorandum, “Expectations for Testing of Mission-Critical Systems,” dated March 17, 1998.

Elements of a Testing Plan

FCA examiners' review of an FCS institution’s testing plan will focus on certain elements that should be addressed in the plan. Those elements apply to institutions that test internally developed systems, as well as institutions that test with service providers and software vendors.

Testing Environment: Considerations for an appropriate test environment include whether to partition currently operating computers, setting aside one or more sections to be used only for testing, or to use a separate computer facility for testing. If the institution uses either a separate computer facility or the computer at its contingency site, it should consider how all interfaces, both internal and external, will be duplicated and adequately tested. Management should evaluate whether the test environment has sufficient computing capacity to complete the testing plan.

Testing Methodology: The plan should address the types of tests for each application and system. See "Testing Methodologies" above for a description of various tests.

Test Schedules: The plan should identify when software and hardware will be tested, including interfaces between systems. Test schedules also should be coordinated with the test schedules of third parties.

Human and Financial Resources: The plan should include budget issues, as well as a description of the participants to be involved in testing (e.g., information technology staff, end-users, and external parties).

Critical Test Dates: There are certain critical dates that need to be tested for each mission-critical system. If an FCS institution's systems or applications fail to operate properly when tested for these critical dates, management must determine whether remediation and subsequent testing can be completed successfully or whether contingency plans must be implemented. Critical dates may vary from institution to institution, so each institution should test for any additional dates it deems critical. At a minimum, FCS institutions should test the key century-rollover dates, including the "rollover" or progression before and after those dates, to ensure that applications and systems will operate properly.
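As an illustration of such testing, the sketch below generates the progression before and after a set of rollover dates. The dates shown are those commonly cited in FFIEC Year 2000 guidance and are an assumption for illustration; each institution should substitute the dates critical to its own systems.

```python
from datetime import date, timedelta

# Rollover dates commonly cited in FFIEC Year 2000 guidance (an
# illustrative assumption, not an FCA-prescribed list).
CRITICAL_DATES = [
    date(1999, 1, 1),    # first business dates falling in 1999
    date(1999, 9, 9),    # 9/9/99, historically used as a sentinel value
    date(2000, 1, 1),    # the century rollover itself
    date(2000, 2, 29),   # 2000 is a leap year (divisible by 400)
    date(2000, 12, 31),  # day 366 of a leap year
    date(2001, 1, 1),    # first year-end rollover after 2000
]

def rollover_window(critical: date, days: int = 1):
    """Yield the progression of dates before, on, and after a critical date."""
    for offset in range(-days, days + 1):
        yield critical + timedelta(days=offset)

for critical in CRITICAL_DATES:
    window = list(rollover_window(critical))
    # In a real test, each date in the window would be fed to the system
    # under test and its output compared against expected results.
    assert window[0] < critical < window[-1]
```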

Documentation: Each FCS institution should maintain written documentation supporting every stage of the testing process. This documentation provides an audit trail and should facilitate correction of problems if they occur. The documentation should include the following:

- Types of tests performed (e.g., baseline, unit, regression, etc.);
- Explanation of why an institution chose the tests that it performed and how extensive those tests were;
- Results of tests;
- Criteria used to determine whether an application or system is deemed Year 2000 ready;
- Plans for remediating and retesting any computers, systems, or applications that failed Year 2000 tests; and
- Individuals responsible for authorizing the testing plan and accepting testing results.
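As a minimal sketch of how these documentation elements might be captured in an audit trail, the record layout below mirrors the list above; the field names and sample entry are illustrative assumptions rather than an FCA-prescribed format.

```python
from dataclasses import dataclass

@dataclass
class TestRecord:
    """One entry in a Year 2000 testing audit trail (fields are illustrative)."""
    test_type: str           # e.g., "baseline", "unit", "regression"
    rationale: str           # why this test was chosen and how extensive it was
    results: str             # observed test results
    readiness_criteria: str  # basis for deeming the system Year 2000 ready
    remediation_plan: str    # remediation/retest plan for any failures
    approved_by: str         # who authorized the plan and accepted the results

audit_trail = [
    TestRecord(
        test_type="future date",
        rationale="mission-critical loan system; full rollover coverage needed",
        results="all critical dates processed without error",
        readiness_criteria="zero date-handling errors across tested dates",
        remediation_plan="none required",
        approved_by="testing coordinator",
    ),
]
```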

Testing Internally Developed Systems

FCS institutions with internally developed systems should establish a formal process for testing these systems. The institution should test mission-critical systems first. When internal expertise is unavailable, management should retain appropriate external technical expertise to test and evaluate test results. In addition, FCS institutions should conduct testing between their own internal systems and any interface with external entities.

Testing with Service Providers, Software Vendors, and Other Third Parties

System institutions should coordinate and implement (where appropriate) test plans that address testing with service providers, software vendors, and other third parties, as discussed in the section on "TESTING FOR YEAR 2000 READINESS." The following are options for testing with service providers, software vendors, and other third parties.

Service Providers: Although it is preferable for FCS institutions to test the full range of applications provided by service providers, the results of proxy tests may be acceptable if certain conditions are met. In proxy testing, the service provider tests with input from a representative sample of institutions (user groups) that use a particular service on the same platform. Test results are then shared with all similarly situated clients of the service provider. The FCS institution remains responsible for assessing the testing results provided by service providers to determine whether it can rely on the proxy test results.

Software Vendors: Software provided by software vendors should be tested in the institution's own environment whenever possible, because this is the best indicator that the institution's systems are Year 2000 ready. Such testing can be done in a variety of ways, including obtaining a testing package from the software vendor and testing within the System institution's own test environment. Any interfaces with the vendor-supplied applications also should be tested within the institution's own testing environment to confirm that, when used together, they will function properly. If the institution is unable to test wholly within its own environment, testing can be conducted at a contingency site or a disaster recovery "hot site." A contingency site is a separate facility configured with identical or similar hardware that the institution would use to process transactions and produce records if its own environment became inoperable.

Other Third Parties: System institutions should test their mission-critical applications with material third parties to whom they transmit or from whom they receive data. Other third parties may include System institutions, other institutions, payment system providers, clearinghouses, customers, and, to the extent possible, utilities.

VERIFICATION OF TESTING PROCESS

FCA expects institutions to critique their own Year 2000 tests to ensure that the tests are effective and that key dates are checked. When an FCS institution lacks internal expertise, management may use other qualified professionals, such as management consultants or Certified Public Accounting firms, to provide an independent review. When most or all of an institution's services are provided by vendors or service providers, management should ensure that the vendors have performed similar reviews and should receive the results of those reviews. FCA examiners will review the institution's evaluation of its testing process and, based on the quality of that evaluation, set the examination scope accordingly.

In addition to ensuring that existing systems will function properly for critical dates, management must ensure that all new applications, operating systems, software, and hardware are Year 2000 ready before installation. FCS institutions should test all systems, products, and services, regardless of when they were upgraded or purchased.

ADDITIONAL INFORMATION ON TESTING

There are other sources of information on Year 2000 testing in addition to this document. System institutions may benefit from researching websites maintained by their software vendors, service providers, and others. For example, the United States General Accounting Office's (GAO) "GAO Year 2000 Guidelines" includes checklists that institutions may find useful. The guidance can be obtained from the GAO or from its website (www.gao.gov). The Federal Financial Institutions Examination Council’s website (www.ffiec.gov) also provides useful information for System institutions on Year 2000 testing.

If you have any questions regarding this document, please call Thomas M. Glenn, Director of Operations, Office of Examination, at (703) 883-4412. Contact can also be made at the following E-Mail address: glennt@fca.gov.