Informational Memorandum
Subject: Computer-Based Model Validation Expectations
Date of Memorandum: 06/17/2002
Expiration Date:
Office: OE
Signed By: Smith, Roland
FCA Contact Person: Yowell, Gregory
Contact Phone: 703-883-4371
List of Attachments: Model Validation Process

Informational Memorandum


June 17, 2002


To: The Chief Executive Officer
All Farm Credit System Institutions

From: Roland E. Smith, Director
Office of Examination

Subject: Computer-Based Model Validation Expectations


PURPOSE

This Informational Memorandum provides guidance to Farm Credit System (FCS) institutions concerning the Farm Credit Administration’s (FCA) expectations associated with computer-based model validation processes. The reliance on unvalidated models is considered a poor business practice. In addition, depending on the circumstances, examiners will consider the use of unvalidated models to manage risk as a potentially unsafe and unsound practice.

BACKGROUND

Computer models often provide critical information that management uses to support decisions in areas such as asset-liability management, risk measurement and management, loan pricing, valuing financial instruments, credit scoring, economic projections, capital allocations, and strategic planning. Therefore, sufficient controls must be established to ensure that output from computer models is valid.

Computer models should be viewed as tools that assist the decision-making process. However, their use requires considerable judgment and expertise in applying model results. Overly broad interpretation of model results, modeling errors, problems in the modeling process, or reliance on erroneous outputs can result in faulty decisions.

As models play an increasingly important role in decision-making processes, it is critical that the board and management reduce the likelihood of erroneous model output or incorrect interpretation of model results. The best defense against such model risk is the implementation of a sound model validation framework that includes a validation policy and appropriate independent review.

____________________________________________________________________________________
SUMMARY OF COMPUTER-BASED MODEL VALIDATION EXPECTATIONS

The attachment to this Informational Memorandum describes in greater detail the expectations FCA examiners have when examining an institution’s model validation process. In summary, FCA examiners will review model validation policies, procedures, and practices to ensure that the following expectations are met:

(a) Decision makers understand the meaning and limitations of a model’s results. The impact that changes in assumptions have on final results must be understood, assumption changes should be documented, and data generated by the model must be useful to decision makers without concealing the model’s limitations.

(b) Model results are tested against actual outcomes, particularly when a model has been in use for a reasonable period of time.

(c) The institution should demonstrate a reasonable effort to audit the information entered into the model. Input errors should be promptly corrected.

(d) Staff involved in the modeling process should be qualified and their performance evaluated regularly based on established standards and specific job responsibilities. Training needs should be identified and monitored. There should also be sufficient depth in resources to accommodate unexpected absences of key personnel.

(e) To the extent feasible, model validation should be independent from model construction and maintenance.

(f) Responsibilities for the various elements of the model-validation process should be clearly defined.

(g) Modeling software should be subject to change-control procedures so that developers and users do not have the ability to change code without review and approval by an independent party.

If you have any questions about this memorandum, please call Gregory L. Yowell, Senior Capital Markets Specialist, Special Examination and Supervision Division, Office of Examination, at (703) 883-4371, or contact him by e-mail at yowellg@fca.gov.

Attachment

____________________________________________________________________________________
Attachment
MODEL VALIDATION PROCESS

An appropriate model validation process should incorporate the model construction process; the information input component of the process where assumptions and data are supplied to the model; the processing component that contains the theoretical and mathematical constructs of the model; and the reporting component that translates the mathematical estimates into useful business information. Errors in any of these components can cause the model’s information to be meaningless or misleading.

Boards of directors need to ensure that models produce information that is timely, reliable, useful, and delivered at a reasonable cost. A formal board-approved model validation policy is an important step in ensuring that these goals are met.

Model Validation Policy

An acceptable model validation policy should provide for an independent review of all of the components of a model validation process. The personnel performing the model validation should be as independent as possible from the personnel who construct and maintain the model. When comprehensive independence is not practicable, the policy should explicitly provide for an effective communication process between modelers and decision makers. Model builders should provide clear and informative descriptions of modeling assumptions and limitations to senior management.

The responsibility for model validation should be formalized and defined. The model validation policy should specify that, where practical, the logic, design, and purposes of important models should be independently validated before they enter production.

For models of lesser significance or where cost/benefit considerations do not justify an effective independent review, the policy should require senior management to approve both the conceptual approach and the key assumptions for such models. Senior management should also verify that reasonable quality control processes are in place.

The model validation policy should require documentation for all important models that is adequate to facilitate independent review, train new staff, and where appropriate allow for a replication of the model being described. The documentation should include a description of the purposes and limitations of the model, provide an overview of the general procedures used to maintain the model, describe ongoing validation procedures, and describe the model construction process (including validation procedures and results). It might also be necessary to document the actual code needed to replicate the model if the original is destroyed or compromised.

The model validation policy should require that changes to models be subject to independent review and that controls be in place to ensure that only authorized parties can make changes to the code. The policy should also clearly specify that internal audit personnel will verify that those responsible for model validation adhere to the formal policy.
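
For illustration only, the following sketch shows one way such a change-control gate could be expressed in code. The roles, field names, and approval rule are hypothetical assumptions rather than requirements of this memorandum; institutions would ordinarily rely on their own version-control and audit tools.

```python
# Hypothetical sketch of a change-control gate: a model change may be promoted only
# after independent review and approval. Roles and fields are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelChange:
    change_id: str
    author: str
    description: str
    reviewer: Optional[str] = None
    approved: bool = False


def can_deploy(change: ModelChange) -> bool:
    """Allow promotion only when the change is approved by someone other than its author."""
    return change.approved and change.reviewer is not None and change.reviewer != change.author


if __name__ == "__main__":
    change = ModelChange("MC-042", author="modeler_a", description="update prepayment curve")
    print(can_deploy(change))                     # False: no independent review yet
    change.reviewer, change.approved = "validator_b", True
    print(can_deploy(change))                     # True: independent approval recorded
```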

No matter how sophisticated or accurate a model is, its output will not meet expectations unless the data entered into the model is accurate and timely. The adage “garbage in, garbage out” still holds. Hence, auditing data integrity is an indispensable and separate element of sound model validation and should be explicitly included in the model validation policy.

Controls

Controls should be in place to ensure that information entered into the model agrees with data in the general ledger and accurately reflects the terms and characteristics of outstanding financial instruments and loan contracts. Automatic filters can be used to help identify input errors, and personnel independent of the modeling process (e.g., risk management or internal audit personnel) can notify senior management of data problems. Controls must be sufficient to alert decision makers when the data is unreliable or when additional resources need to be dedicated to providing high-quality data. The validation process should also ensure that reports generated by the model are accurate and that decision makers understand how the information generated by the model is to be used.
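
As a purely illustrative sketch (the record layout, plausibility limits, and reconciliation tolerance below are assumptions, not requirements), automatic input filters and a general-ledger reconciliation might look something like this:

```python
# Illustrative input filters and general-ledger reconciliation; field names, plausibility
# limits, and the reconciliation tolerance are hypothetical assumptions.
from dataclasses import dataclass


@dataclass
class LoanRecord:
    loan_id: str
    balance: float          # outstanding principal
    rate: float             # contractual rate, as a decimal
    maturity_months: int    # remaining term in months


def filter_input_errors(records, gl_balance, tolerance=0.005):
    """Flag records that fail simple range checks and reconcile the total to the general ledger."""
    exceptions = []
    for r in records:
        if r.balance < 0:
            exceptions.append((r.loan_id, "negative balance"))
        if not 0.0 <= r.rate <= 0.25:
            exceptions.append((r.loan_id, "rate outside plausible range"))
        if r.maturity_months <= 0:
            exceptions.append((r.loan_id, "non-positive remaining term"))

    model_total = sum(r.balance for r in records)
    reconciles = abs(model_total - gl_balance) <= tolerance * abs(gl_balance)
    return exceptions, model_total, reconciles


if __name__ == "__main__":
    data = [
        LoanRecord("A-100", 250_000.00, 0.0675, 84),
        LoanRecord("A-101", -1_200.00, 0.0712, 60),   # should be flagged for review
    ]
    errors, total, ok = filter_input_errors(data, gl_balance=248_800.00)
    print(errors, total, "reconciles with GL" if ok else "does NOT reconcile with GL")
```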

Assumptions

Besides raw data, computer models require an array of assumptions, which may be derived from internal or external sources. The model validation policy must ensure that decision makers understand the impact that changes in assumptions have on final results. The impact of changes in assumptions from one reporting period to another should be recorded in an assumption change log, and modelers should be able to provide a clear rationale for each assumption. Important assumptions (such as prepayment rates) should be routinely compared with actual portfolio behavior to determine whether they need to be revised.
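
For illustration, a simple assumption change log and a check of an assumed prepayment rate against observed behavior might be sketched as follows; the field names, drift threshold, and figures are hypothetical assumptions:

```python
# Hypothetical sketch of an assumption change log and a drift check against actual
# portfolio behavior; the threshold and figures are illustrative assumptions only.
from datetime import date

change_log = []


def record_assumption_change(name, old_value, new_value, rationale):
    """Append a dated entry so reviewers can trace why an assumption was changed."""
    change_log.append({
        "date": date.today().isoformat(),
        "assumption": name,
        "old": old_value,
        "new": new_value,
        "rationale": rationale,
    })


def assumption_needs_review(assumed, observed, threshold=0.02):
    """Flag an assumption when it drifts from actual experience by more than the threshold."""
    return abs(assumed - observed) > threshold


if __name__ == "__main__":
    assumed_cpr = 0.12      # assumed annual prepayment rate
    observed_cpr = 0.155    # rate actually observed in the portfolio
    if assumption_needs_review(assumed_cpr, observed_cpr):
        record_assumption_change(
            "prepayment_rate", assumed_cpr, observed_cpr,
            "observed prepayments exceeded the assumption by more than 2 percentage points",
        )
    print(change_log)
```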

Model Construction Steps

The validation policy needs to ensure that the mathematics and computer code in models are error free. This can be done by using another model to validate the findings of the one being tested. Tests against previous conditions can also be run to verify that projected results approximate actual performance. Even if an institution uses vendor models, it should seek assurances that the models are defensible and work as promised.
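
As an illustration of checking one implementation against an independent one, the sketch below compares two present-value calculations of the same fixed-payment stream; the payment, rate, term, and tolerance are hypothetical assumptions chosen only to show the comparison step:

```python
# Illustrative cross-check: two independent implementations of the same calculation
# should agree. Payment, rate, term, and tolerance are hypothetical assumptions.

def pv_by_summation(payment, rate, periods):
    """Present value computed by discounting each payment explicitly."""
    return sum(payment / (1 + rate) ** t for t in range(1, periods + 1))


def pv_closed_form(payment, rate, periods):
    """Independent check using the closed-form annuity formula."""
    return payment * (1 - (1 + rate) ** -periods) / rate


if __name__ == "__main__":
    a = pv_by_summation(1_000.0, 0.005, 360)
    b = pv_closed_form(1_000.0, 0.005, 360)
    # The two implementations should agree to within rounding error.
    assert abs(a - b) < 1e-6, "implementations disagree; investigate the code"
    print(round(a, 2), round(b, 2))
```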

Modelers use theoretical concepts to draw relationships from the data in their models. The model validation policy should require an independent review of these theories. Modelers should provide clear descriptions, in nontechnical terms, of the theories underlying their models and show that these theories have received recognition and support from professional journals or other forums. Comparing a new model with existing models is often useful for uncovering errors, confirming expectations, or at least enhancing understanding of the model being studied.

Validation of Accuracy

Many of the procedures used to validate the input and processing components of a model are also useful for validating the model results. At the time a model begins to produce outputs, model developers and validators should compare its results against those of comparable models, market prices, or other available benchmarks. Once in use, model estimates should continually be compared with actual results, a procedure often referred to as back testing. Many models, asset-liability models in particular, produce projections that are conditional upon the economic environment that actually materializes; over time, such conditional projections can be validated against actual outcomes.