Cognitive Bias and the Actuary
Cognitive bias is, roughly speaking, a misinterpretation of the objective world caused by the inadequacies, the scratches, the imperfections of a person’s own subjective lens. One common definition is:
‘A cognitive bias is a systematic error in thinking that occurs when people are processing and interpreting information in the world around them and affects the decisions and judgments that they make.’[i]
Cognitive bias may be at work when a colleague always seems to agree with another, or when someone’s retrospective version of events feels distorted. These are errors in our judgement or thinking that occur frequently. Awareness of cognitive bias has grown in recent years. Books such as ‘Thinking, Fast and Slow’[ii] and ‘Predictably Irrational’[iii] have been bestsellers, there is a new focus on Behavioural Finance, and professional actuarial bodies have included cognitive biases in their syllabi. The role of cognitive bias in the workplace now occupies serious professional and academic thinking.
Cognitive bias can separate us from rationality. No one likes to think of themselves as irrational but, at least in some situations, most of us are. The first step to removing these biases is general awareness. We can then start to recognise scenarios where they could arise in our day-to-day lives and consider practical steps to minimise their frequency and risk. Following that approach, this article introduces some types of cognitive bias, gives a fictional case study from an actuarial setting where cognitive biases were likely at play, and ends with some brief tips for preventing them.
Types of cognitive bias
Below I introduce four types of cognitive bias that may be relevant in actuarial work, but there are many others.
Status quo bias
The mentality of status quo bias can be roughly grasped through sayings like ‘if it ain’t broke, don’t fix it’ or ‘don’t reinvent the wheel’. In practice, status quo bias runs deeper than those simple sayings suggest:
‘Status quo bias is the inherent tendency of people to stick with their current situation, even in the presence of more favourable alternatives and even when no transaction costs are involved.’[iv]
Maintaining the status quo could perhaps be excused if transaction costs or resource constraints discouraged a change. Actuaries often need to strike a balance between innovation and resource availability in the workplace; however, status quo bias can lead to an erroneous assessment and make one err on the side of, well, the status quo.
Framing effects
The cognitive bias known as framing effects is defined as follows:
‘Framing effects refer to the way in which a choice can be affected by the order or manner in which it is presented’[v]
This phenomenon has both positive and negative consequences. Members of a team can discuss where framing effects exist in their current project and how they can be turned to their advantage.
For example, framing can be leveraged by carefully crafting questions as a marketing technique, it may introduce unwanted bias into a survey, or it could help us look at questions from different angles.
Confirmation bias
Conspiracy theorists and cult members often seem unable to change their point of view. They usually believe they have a valid explanation for every detail of a matter, perhaps because they only pay genuine attention to cult leaders, fellow members and obscure websites when forming and maintaining their opinions.
However, confirmation bias is present all around us. People often feel a sense of satisfaction when they come across a view that supports their own, and a reluctance towards opposing ones.
‘Confirmation bias is that people tend to look for evidence that confirms their point of view (and will tend to dismiss evidence that does not justify it).’[vi]
The halo effect
Consider stereotypes: imagine the friend who seemed to get away with everything growing up, the charismatic politician who has the whole world fooled, or the person with the gift of the gab who can talk themselves out of anything.
‘The halo effect is a cognitive bias that occurs when an initial positive judgement about a person unconsciously colours the perception of the individual as a whole’[vii]
Innocence or ignorance can make us victims, and the many cognitive biases, pre-conditioning, prejudices and so on open us up to all sorts of manipulation. For example, the halo effect could allow an abusive relationship to continue for longer than it should.
A case study: the ‘Dinosaur Model’
We now look at a fictional actuarial case study, then consider the cognitive biases that may have contributed and some preventative measures.
An actuary developed a complex yet flexible model to value the liabilities of a with-profits product with various options and guarantees. The model, coded and thoroughly signed off 25 years ago, is used both for individual policies and at a portfolio level. The actuary is now a senior manager and tells junior colleagues how proud the business is of his model. New analysts don’t need to be familiar with the coding language to use it: they input the parameters and hit ‘run’. The impacts of changing the parameters and assumptions on the results are checked and signed off by senior staff. It is quick, easy to use and accurate.
The product was very popular but it has since been discontinued. Many of the policyholders are claiming or exercising their options and/or guarantees. Whenever there is a material movement due to adverse claims, actuaries can validate and sense check that the movements in the modelled liabilities are reasonable, after adjusting the assumptions and/or parameters.
Regulatory changes now require the model output to be changed. This has forced the company to find another solution. It costs the company millions of pounds to find staff and contractors with the skill set to understand the historic product and build a new model.
When building the new model, some material issues with the methodology of the previous model were discovered. They were likely to have been immaterial at first, but with the current business mix they are material. The change is causing a loss on the financial statements, in addition to the costs incurred to make the change, at a time when the company faces losses elsewhere in the business and increasing shareholder concern. The previously prized model is now creating havoc. Although warning of the regulatory change came well in advance, the issues necessitating the handover to a new model came as a total surprise to the actuary who first developed it and to other senior colleagues across the business.
Had the issues been appreciated before the last minute, work could have commenced at a rate that would have allowed a cheaper handover with fewer losses.
Cognitive bias that may have contributed
It is difficult to attribute a situation like this to a single error in judgement; more likely, there were several missed opportunities to prevent or discover the issues earlier. Here we focus on the cognitive bias aspects of trying to prevent this outcome. There are many weapons in an actuary’s arsenal for dealing with these risks.
There is obvious status quo bias at play in people’s acceptance of the original model, for example in trusting assumptions about the sign-off processes behind it. There may already have been suspicions that a flaw would present problems later on, but resource constraints may have swayed judgement in favour of complacency. Most of those involved would have faced conflicting priorities and heavy workloads and, perhaps at a subconscious level, some felt the need to get through tasks rather than look to ‘reinvent the wheel’.
There is likely some confirmation bias in the changing of parameters and sense checking of results. It is not unreasonable for junior staff to rely on sign-off processes to satisfy themselves that the model is correct, but more experienced staff should be able to take a more independent view. For users of the model, seeing that controls were in place may have been all the evidence they needed to accept that things were fine.
Perhaps the issue was a known simplification when the model was first built. Given that the issue went undetected for so long, likely by auditors, risk teams, users of the model and others, this could suggest a lack of understanding of the product and model. Having only a handful of people working on the same thing for a prolonged period increases the risk of confirmation, status quo and group biases. On the other hand, bringing new people into a process too frequently can make it inefficient, so a balance must be struck between familiarity and having fresh eyes from time to time.
The halo effect is likely at play if the actuary who first developed the model was impressive in other areas. The effect would probably have continued as new members of staff looked up to the actuary’s progression and heard them speak about the model. Perhaps this is less of an issue the more removed staff are from the actuary, if they only communicate in writing or have never spoken at all. Nevertheless, rumours or hype may still persist, and these can affect staff behaviour.
Many cognitive biases beyond those mentioned in this article have been identified and could have contributed to the scenario in the case study: for example, anchoring and adjustment when reviewing results, cognitive dissonance when the issues were first discovered, herd behaviour and hindsight bias. This sort of analysis is quite subjective, and many different insights could be drawn.
Tips for preventing cognitive bias
First and foremost, as described by Daniel Kahneman[viii] (winner of the Nobel Memorial Prize in Economic Sciences), for all of these biases we can think through and analyse our behaviour slowly and methodically to make ourselves less susceptible to their effects. Clearly this cannot be done in every instance, so we depend on our training and experience to develop these habits.
Processes actuaries already have in place, such as clear and thorough documentation, independent reviews and validation, do/check/review processes, risk management functions and auditors, and well-designed spreadsheets, are all good ways to give ourselves time to pause and prevent these issues from occurring. The ‘Dinosaur Model’ case study is obviously fictional but, in reality, things will still sometimes fall through the cracks.
As a countermeasure to confirmation bias, individuals can deliberately reverse their mindset and actively seek out evidence that does not support their current view. This is likely the attitude of risk teams and auditors, but it is important for those closer to the work to have this mindset too.
Changing the staff working on a project and asking them to fully understand, review and redocument the model or process might help identify issues and counter status quo bias and the halo effect.
Framing effects can be used to our advantage if we ask ourselves, or discuss in small groups, equivalent questions presented in different ways.
Research on cognitive biases is a relatively new field, and human behaviour is not a phenomenon we can easily observe scientifically. The research and statistics can be inconclusive or even contradictory. For example, studies have attempted to determine which of several arbitrary options an average group will favour based on the order in which they are presented. The existing studies show great diversity in their conclusions, variously giving priority to the first, the last and the intermediate options presented[ix].
Despite the inherent difficulties of studying humans’ unpredictable and irrational behaviour, people are using knowledge of these biases to their advantage in fields like Behavioural Finance. An abundance of cognitive biases has been identified, some more relatable than others, but research in this field is still developing. There is no established method for eliminating them from our thought processes, but I believe that contemplating how we may succumb to and prevent them is a step in the right direction. I feel that having some appreciation of what cognitive biases are and, ideally, how to manage them is incredibly beneficial in both personal and actuarial life.
[ii] Kahneman, D. (2011) Thinking, Fast and Slow. New York: Farrar, Straus and Giroux
[iii] Ariely, D. (2008) Predictably Irrational. London: HarperCollinsPublishers
[iv] IFoA. (2019), CM2 Financial Engineering and Loss Reserving: Core Reading
[viii] Kahneman, D. (2011) Thinking, Fast and Slow. New York: Farrar, Straus and Giroux
[ix] IFoA. (2020), SP5 Investment and Finance Principles: Core Reading