Principle three – Banks have implemented a robust model development and implementation process to ensure appropriate use of models

Back in April, the PRA published its Supervisory Statement SS3/18, setting out its expectations of the model risk management practices that firms should adopt when using stress test models. We explore what each of the principles means and, more importantly, what the pitfalls are and how you can make sure you don't fall foul of the regulator.

Under SS3/18, which covers model risk management principles for stress testing, the PRA provides what is essentially common-sense, high-level guidance on the model development process. It has not provided a detailed how-to guide, nor should it: there is plenty of detailed guidance available, including our own best practice guideline published with the ICAEW. Instead, it addresses the model development process at a principles level, and below we comment on and amplify its nine principles with some practical suggestions.

1. Model purpose and design
Start with the basics: what is the model for? Consider what question you are seeking to answer with the model, and check that it can actually be answered given the available data and mathematical techniques. At a practical level, document the model specification thoroughly and have it reviewed and signed off by the model sponsors, so that there is an agreed common understanding of the task. As the model is developed, it should be reviewed and checked back against the specification.

2. Use of data
The old adage 'garbage in, garbage out' applies sharply here. We have found data quality to be a major constraint on how closely reality can match aspiration in model design. As far as possible, make sure that the data set is complete, consistent and robust. Test the model logic thoroughly to ensure that edge cases are processed correctly and/or that actual and potential data errors are flagged for user resolution. We find it helps to have nominated data owners for each assumption or data set, and to use a managed change control process to ensure that data is also internally consistent across different data sets (e.g. contemporaneous data extracts).
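
By way of illustration, the minimal Python sketch below shows the kind of automated data quality checks we have in mind. The data sets and column names (balances, rates, as_at_date) are hypothetical placeholders, not a prescribed schema.

```python
import pandas as pd

def validate_inputs(balances: pd.DataFrame, rates: pd.DataFrame) -> list[str]:
    """Return a list of data quality issues for user resolution (illustrative checks only)."""
    issues = []
    # Completeness: no missing values in fields the model depends on
    for col in ["account_id", "balance", "currency"]:
        if balances[col].isna().any():
            issues.append(f"missing values in balances.{col}")
    # Plausibility: flag edge cases for resolution rather than silently processing them
    if (balances["balance"] < 0).any():
        issues.append("negative balances found; confirm sign convention")
    # Consistency across data sets: every currency needs a matching FX rate
    missing = set(balances["currency"]) - set(rates["currency"])
    if missing:
        issues.append(f"no FX rate for currencies: {sorted(missing)}")
    # Contemporaneity: all extracts should share the same as-at date
    if balances["as_at_date"].nunique() > 1 \
            or balances["as_at_date"].iloc[0] != rates["as_at_date"].iloc[0]:
        issues.append("data extracts are not contemporaneous")
    return issues
```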

3. Testing
Develop a rigorous test plan: use edge cases, extreme values, simple sensitivities, zero testing and limit testing to stress the model and identify anomalous results. We find that variance analysis of the differences between a test case (or sensitivity) and the base case is a great way of understanding whether a model is behaving directionally and proportionally as expected.
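
To sketch what we mean by that, the illustrative Python below lays a sensitivity run alongside the base case line by line; the line items and figures are invented for the example.

```python
def variance_analysis(base: dict[str, float], sensitivity: dict[str, float]) -> None:
    """Print a line-by-line comparison of a sensitivity run against the base case,
    so directional and proportional behaviour can be checked against expectations."""
    print(f"{'line item':<18}{'base':>10}{'sensitivity':>13}{'variance':>10}{'%':>8}")
    for item, base_value in base.items():
        sens_value = sensitivity[item]
        variance = sens_value - base_value
        pct = variance / base_value * 100 if base_value else float("nan")
        print(f"{item:<18}{base_value:>10,.0f}{sens_value:>13,.0f}{variance:>10,.0f}{pct:>7.1f}%")

# e.g. a funding-cost sensitivity: interest expense should rise and net income
# fall; any line moving the "wrong" way or by an odd amount warrants investigation
variance_analysis(
    base={"interest_income": 500, "interest_expense": -200, "net_income": 300},
    sensitivity={"interest_income": 510, "interest_expense": -230, "net_income": 280},
)
```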

4. Documentation
User documentation is key to helping users understand how to drive a given model; ideally it should be embedded within the model and contextual. Documentation should be clear and unambiguous to avoid scope for misunderstanding between users; it also provides a measure of protection against key man/concentration of knowledge risk. Well-designed models can be self-documenting to an extent: a good spreadsheet model, chunked down into bite-size calculations, should ideally be capable of passing the pen and calculator test, i.e. could you print the calculations and check them manually?

5. Use of judgement
In developing models, organisations need to be careful not just about explicit judgements (e.g. identified values), but also about implicit judgements (decisions on methodology or calculation bases). Any such judgements should be clearly documented but also tested. We would advise limit testing to establish by how much an assumption can shift before the conclusions of a model or piece of analysis change.
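
By way of example, the sketch below implements limit testing as a simple bisection search for the point at which a conclusion flips. The NPV model, cashflows and threshold are hypothetical, and the approach assumes the output moves monotonically with the assumption being flexed.

```python
def limit_test(model, threshold: float, lo: float, hi: float, tol: float = 1e-6) -> float:
    """Bisection search for the assumption value at which the model's output
    crosses a decision threshold, i.e. where the conclusion would change."""
    assert (model(lo) - threshold) * (model(hi) - threshold) < 0, "threshold not bracketed"
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if (model(mid) - threshold) * (model(lo) - threshold) > 0:
            lo = mid  # mid is on the same side as lo, so move lo up
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical example: how far can the discount rate rise before NPV turns negative?
cashflows = [-1000, 300, 300, 300, 300, 300]
npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
breakeven = limit_test(npv, threshold=0.0, lo=0.0, hi=0.5)
print(f"Conclusion flips once the discount rate exceeds {breakeven:.2%}")
```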

6. Supporting systems
Where models are integrated with or within other systems, it is critical to make sure that they are fully compatible. Check in particular that the treatment of floating point arithmetic is consistent, and that data types, units and sign conventions are translated accurately and consistently across interfaces.
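
To make this concrete, the short Python sketch below illustrates both points: comparing floating point figures within an agreed tolerance rather than exactly, and translating units and sign conventions explicitly at the interface. The conventions shown are hypothetical.

```python
import math

# The same figure computed in two systems rarely agrees to the last bit,
# so reconcile within an agreed tolerance rather than testing exact equality.
value_from_model = 0.1 + 0.2   # stored as 0.30000000000000004
value_from_ledger = 0.3
assert value_from_model != value_from_ledger  # naive equality check fails
assert math.isclose(value_from_model, value_from_ledger, rel_tol=1e-9)

# Translate units and sign conventions explicitly at the interface, rather than
# relying on both systems happening to agree.
def to_ledger_convention(exposure_gbp_m: float) -> float:
    """The model holds exposures in GBP millions with liabilities negative;
    the ledger expects whole GBP with liabilities positive."""
    return -exposure_gbp_m * 1_000_000

print(to_ledger_convention(-2.5))  # a GBP 2.5m liability is passed on as 2500000.0
```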

7. Business involvement
The PRA is spot on here: often the people best placed to test and challenge a model are the sponsors or business users, not the model developer or, in some cases, even an independent reviewer. Experience has shown us that a business expert’s intuitive understanding of a financial product will flush out anomalies and errors in a model quicker than self-review by the developer.

8. Model uncertainty
Uncertainty is intrinsic to forecasting financial performance. While stochastic and probabilistic techniques are available to modellers, they are complex and often provide only spurious accuracy. Frequently, businesses do not have the in-house depth of understanding of the correlation between business drivers, or the statistical capability, to use these techniques properly. We therefore sound a note of caution and suggest that simple sensitivity testing, scenario testing and limit testing are often a better place to start.
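
As a starting point, the sketch below shows one-at-a-time sensitivity testing: each assumption is flexed by 10% in turn and the outputs ranked by impact, giving a simple 'tornado' view. The model and assumptions are hypothetical placeholders.

```python
# Flex each assumption individually by +/-10% and rank the impacts on output.
def profit(a: dict[str, float]) -> float:
    return a["volume"] * (a["price"] - a["unit_cost"]) - a["overheads"]

base = {"volume": 10_000, "price": 12.0, "unit_cost": 7.0, "overheads": 30_000}
base_profit = profit(base)

impacts = {}
for name in base:
    down = profit({**base, name: base[name] * 0.90}) - base_profit
    up = profit({**base, name: base[name] * 1.10}) - base_profit
    impacts[name] = (down, up)

# Rank by the larger of the two movements, biggest driver first
for name, (down, up) in sorted(impacts.items(),
                               key=lambda kv: -max(abs(kv[1][0]), abs(kv[1][1]))):
    print(f"{name:<12} -10%: {down:>+10,.0f}   +10%: {up:>+10,.0f}")
```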

9. Monitoring
The PRA highlights the importance of periodic monitoring of model performance. In our experience, this is frequently overlooked by clients. It can range from simple variance analysis against plan, through bridge analysis, to back testing using actual performance data. These are powerful techniques for testing and validating the model's assumptions (both explicit and implicit) and should be undertaken in a disciplined and regular manner.
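
For illustration, the sketch below back-tests a forecast against actual outturn period by period and flags variances beyond an agreed tolerance for investigation. The figures and the 5% tolerance are hypothetical.

```python
# Compare what the model forecast with what actually happened, period by period.
forecast = {"Q1": 410.0, "Q2": 425.0, "Q3": 440.0, "Q4": 455.0}
actual = {"Q1": 402.0, "Q2": 431.0, "Q3": 398.0, "Q4": 460.0}

TOLERANCE = 0.05  # investigate anything more than 5% away from plan

for period in forecast:
    pct = (actual[period] - forecast[period]) / forecast[period]
    flag = "  <-- investigate" if abs(pct) > TOLERANCE else ""
    print(f"{period}: plan {forecast[period]:7.1f}  actual {actual[period]:7.1f}  "
          f"variance {pct:+6.1%}{flag}")
```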

One area the PRA supervisory statement doesn’t really cover is version control and audit trail. In developing a model, special attention should be given to maintaining the integrity of the master model and avoiding the proliferation of multiple sister models. Model changes and development iterations should be captured (ideally automatically) in an audit log, to help trace issues and errors and to provide a development audit trail.
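
A minimal sketch of such an audit log is below; the field names and CSV storage are hypothetical, and in a spreadsheet context the same record might be written automatically to a hidden log sheet instead.

```python
import csv
import getpass
from datetime import datetime, timezone

def log_model_change(log_path: str, version: str, component: str, reason: str) -> None:
    """Append one change record to a CSV audit trail: when, who, what and why."""
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),  # when the change was made
            getpass.getuser(),                       # who made it
            version,                                 # development iteration
            component,                               # which part of the model changed
            reason,                                  # why it changed
        ])

log_model_change("model_audit_log.csv", "v1.4",
                 "discount_curve", "switched from flat to bootstrapped curve")
```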

Principle 3 covers a lot of ground very quickly but bears slow and careful consideration, as the consequences of getting models wrong can be costly. In August this year, Transamerica entities affiliated with Aegon USA Investment Management LLC (AUIM) paid nearly $100m to investors who, according to the SEC’s order, had put billions of dollars into mutual funds and strategies using faulty models developed by investment adviser AUIM. The order also found that the models, developed solely by a single inexperienced, junior AUIM analyst, contained numerous errors and did not work as promised.

Please do get in touch with Alistair Hynd or Jon Pepper if you have any questions about financial modelling and the associated model risks, which we can help you mitigate.
