08.05.2020

APR Model Documentation: The Beginner’s Guide

When we think about the different kinds of work we undertake in our actuarial careers, it can be easy to underestimate the importance of documentation.  Whilst documentation isn’t necessarily everyone’s cup of tea, it is an important aspect of actuarial work which is reflected in both the Actuaries’ Code and the Technical Actuarial Standards.

We are passionate about documentation at APR and have built up considerable experience stretching from junior staff all the way up to our founding partner, Roger Austin, who led the development of a major insurer’s Solvency II documentation framework.

Many of us think about model documentation in the context of large, complex models and complying with regulatory standards, like Solvency II, where each company will have its own well-developed framework.

Many of the principles which underlie these frameworks apply equally to smaller, less complex modelling projects.  For example, I recently built and documented a tool to help a reporting team understand and explain valuation run results, and, in a product governance project, a model which calculated remediation values for impacted policies.  Although both fell outside the clients’ formal model documentation structures, it was still necessary to produce good supporting documentation.

In this article, I consider how to approach documentation for these less complex modelling projects, a task which often falls to junior actuarial staff.  Typically, there is less formal guidance than for major internal models, and there can be a tendency to see documentation as a box-ticking exercise.  The article makes the case that effective documentation is a key element of the model, and should be created as the model evolves rather than as an afterthought.  We can think about what to include by asking the following question:

What do users, reviewers and developers of this model need to know?

Users of the model (current and future)

Before anyone starts clicking buttons and generating outputs, the first thing a user needs to know is: what should this model be used for, and indeed, what should it not be used for?  A clear statement of the problem or process that the model is built to help with, and any known limitations on its use, will help to prevent confusion further down the line.  Enough context should be provided for a colleague who has not been involved in the project to understand what the model does and why it’s needed.

Once the user has decided that the model is appropriate for their purpose, the documentation needs to explain clearly how the model is used.  For example, inputs might include some policy data being copied into a tab of a spreadsheet, in which case the user needs to know where to find that data and where to copy it into the model.  The user may then need to select some parameters in a separate tab, before “running” the model by recalculating the spreadsheet or using a macro.  Often even small tools built in Excel and VBA can end up being reused by people other than the original developer, and straightforward instructions here can save a lot of time and difficulty for current and future users.
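For a scripted tool, this “what it is for, what it is not for, and how to run it” information can live at the top of the file itself.  Below is a minimal sketch in Python (the article’s examples are Excel/VBA, but the principle carries over); the tool name, file names and parameter values are all invented for illustration:

```python
"""Remediation value calculator (illustrative sketch).

Purpose:   estimate corrected values for policies affected by the error.
Not for:   pricing or reserving; the interest basis here is simplified.

How to use:
  1. Copy the policy extract into policies.csv (columns: policy_id, premium).
  2. Review the parameters below (RATE, YEARS) before each run.
  3. Call remediation_value(premium) for each policy; the result is the
     corrected accumulated value, in the same currency as the premium.
"""

RATE = 0.03   # illustrative annual effective interest rate (assumed)
YEARS = 5     # illustrative accumulation period in years (assumed)


def remediation_value(premium: float) -> float:
    """Accumulate a single premium at RATE compound for YEARS."""
    return premium * (1 + RATE) ** YEARS
```

A colleague picking this up for the first time can see the scope, the inputs and the run steps before reading a single line of calculation logic.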

The documentation should then explain the model’s output.  This may be contained within the model as a tab in a spreadsheet, or the model may produce separate files.  The user will need to know where to look and understand what they’re looking at.  The documentation should also set out openly the limitations and judgements that feed into the model and affect the results.  Anyone using the results should have a clear understanding of how they are shaped by the assumptions made by the developer of the model, and factor this into their judgements about the results.

Model review

Model documentation should facilitate straightforward reviews by others.  The first task here is to make sure it’s clear how the model works.  If it includes code, then that code should be well commented so that each line’s purpose is clear.  If it includes spreadsheet work, then the various options – cell comments, a documentation tab, etc – should be used to make sure that it is similarly clear what each tab and each column is doing.  Crucially, no matter the format, documentation should be well signposted and easy to find – documentation is designed to assist users and reviewers, not frustrate them.
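Good comments explain intent rather than restating syntax.  A short Python sketch of the kind of commenting that helps a reviewer (the formula is the standard annuity-certain factor; the function name and structure are our own for illustration):

```python
def annuity_factor(rate: float, n_payments: int) -> float:
    """Present value of an annuity-certain of 1 p.a., paid annually in arrears."""
    # Zero-interest case: the general formula below would divide by zero,
    # and the factor is simply the number of payments.
    if rate == 0:
        return float(n_payments)
    # Discount factor for one year at the given effective rate.
    v = 1 / (1 + rate)
    # Standard annuity-certain formula: a_n = (1 - v^n) / i.
    return (1 - v ** n_payments) / rate
```

A reviewer can check each commented step against the stated formula, rather than reverse-engineering the developer’s intentions from the code alone.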

Creating this documentation as the model is being developed will not only make it clearer and easier to use for others in the future; it can also help to clarify the original developer’s thinking as the model is created, and prevent “black box” situations, where a model seems to meet the requirements of the task but the creator struggles to explain it to a reviewer or a manager who will need to have confidence in its results.  Knowing that a clear explanation of what each calculation does will be required can be a strong incentive to make those calculations clearer within the model, by splitting them up into more manageable pieces.
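This splitting-up can be as simple as replacing one dense expression with named intermediate steps.  A before-and-after sketch in Python (the calculation, a death benefit payable in one year, is invented purely for illustration):

```python
# Harder to explain to a reviewer: everything in one expression.
def expected_pv_dense(benefit, q, rate):
    return benefit * q * (1 / (1 + rate))


# Easier to document and review: each step is named and commented.
def expected_pv(benefit: float, q: float, rate: float) -> float:
    """Expected present value of a death benefit payable in one year."""
    discount_factor = 1 / (1 + rate)    # value today of 1 paid in a year's time
    expected_benefit = benefit * q      # benefit weighted by probability of death
    return expected_benefit * discount_factor
```

Both functions return the same number, but the second version gives the developer something concrete to point at when walking a reviewer through the logic.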

Documentation should also explain how the model has been tested and validated, including a clear statement of any limitations.  To take the example of a standalone spreadsheet calculating remediation values, it may be possible to apply the corrected remediation calculations to a sample set of policies on the administration systems, and obtain a set of test cases to validate the spreadsheet calculation logic.  In this case the documentation should cover the source of these test cases and why they adequately validate the model.
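Where the model is scripted, this validation can itself be written down as a re-runnable check.  A minimal Python sketch, assuming (hypothetically) that test cases have been extracted from the administration system as pairs of inputs and expected values; the model calculation and all figures are invented for illustration:

```python
# Hypothetical test cases extracted from the administration system:
# (premium, expected remediation value), rounded to 2 d.p. by that system.
TEST_CASES = [
    (100.00, 115.93),
    (250.00, 289.82),
    (1000.00, 1159.27),
]


def remediation_value(premium: float, rate: float = 0.03, years: int = 5) -> float:
    """Illustrative model calculation: compound accumulation of one premium."""
    return premium * (1 + rate) ** years


def validate(tolerance: float = 0.005) -> list:
    """Return the test cases where the model disagrees beyond rounding."""
    failures = []
    for premium, expected in TEST_CASES:
        modelled = round(remediation_value(premium), 2)
        if abs(modelled - expected) > tolerance:
            failures.append((premium, expected, modelled))
    return failures
```

An empty list of failures, alongside documentation of where the test cases came from and why they are representative, gives a reviewer something concrete to re-run rather than a bare assertion that the model “has been tested”.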

Subsequent model development

In addition to using much of the information already mentioned, someone making further improvements or updates might want to know about the process the original developer went through in building the model.  Providing some context behind the modelling strategy, and describing any failed approaches or problems overcome while building the model, will save future developers from having to spend time rehashing the same ground.

The documentation should also describe the wider framework the model sits in. For example, to what extent are model outputs determined by downstream requirements – and therefore need to be maintained in the same format – and to what extent could they be changed by future developers if need be?  Are there any key stakeholders with an interest in this model, who would need to be consulted about any potential changes to the way it works?  This may be less of an issue with more informal or ad hoc pieces of work, but even these can be adopted into processes quickly, and teams which weren’t involved in the design or build may adapt their own work to fit the output the model provides, thus becoming stakeholders at a later stage.  This highlights the need for future users and developers to ensure documentation is kept up to date.  Our experience is that, despite the hopes of many an actuarial student, documentation is rarely a one-off task.

Recap

Hopefully this has provided a useful framework for smaller documentation pieces, and shown how the principles we are all familiar with when it comes to complex model documentation can be usefully applied to less formal work as well.  Thinking about the different users of your documentation, and planning to include the information they need, can elevate a good piece of work to a higher level by making it much easier and more flexible to use, and will be hugely appreciated by team members and managers both present and future.

Michael Scanlon

May 2020