As part of CSaP’s Policy Fellow seminar series on government’s use of data, science and evidence, Bill Sutherland, Miriam Rothschild Professor in Conservation Biology, Department of Zoology, Cambridge, shared his perspective on the use of evidence and expert advice in decision making, drawing on his research into weaknesses in conventional models of expert elicitation.
Sutherland began the seminar by distinguishing between two types of cognition that people use when making a decision: type 1 cognition, which relies on intuition and instinct and usually operates in a short space of time, and type 2 cognition, which requires deep thinking over a longer period, reflection, and the use of specialist expertise and data to reach a judgement.
“The problem is, when we have a type two problem, something that requires deep thinking, but we use type one thinking to deal with it.”
He highlighted the first course of action when governments are confronted with a problem: ensuring that the full range of possible solutions has been examined. Sutherland illustrated this with the example of the coronavirus pandemic, where there was an attempt to identify the full range of possible solutions. Some governments considered these at an early stage, to reduce the risk of blind spots before implementing policy.
Sutherland then highlighted that the next challenge is actually making the decision when there is an abundance of solutions. Currently in government there are two primary routes: either one expert decides, or a group of experts collectively agrees on the best single course of action. While he notes that diverse groups produce better results, Sutherland suggests both methods are the “worst ways of making decisions” because there are “serious biases” in the way people think. He explained that there are over 200 biases, and highlighted three common ones: overconfidence, when people claim more than they know; anchoring, when the first person to speak influences those who follow; and the availability heuristic, when information a participant happens to have encountered has an outsized influence on the decision.
During the seminar, Sutherland enthused over the ‘Delphi technique’ and suggested that anonymised voting leads to far better outcomes, as it reduces the biases that arise when decisions are made in public. He then called for more systematic, objective, and rigorous systems that make the process of evidence evaluation more consistent and reliable.
One of the participants challenged Sutherland’s analysis, arguing that biases are present even before evidence is evaluated and possible solutions are drawn up, arising from the way a question is framed. Consider, for example, the following two questions: “How can the introduction of renewables be used to achieve net zero 2040 carbon targets?” and “How can the phasing out of coal, oil and gas be used to achieve net zero 2040 carbon targets?” While both questions appear similar, they call for completely different solutions.
Another participant then highlighted the practical challenges of decision making. It was argued that when scanning for all possible solutions, there are time constraints, because most of the government’s work is reactive. It was also suggested that there could be benefits if the Cabinet changed the way it makes decisions, but that this is a political issue for further consideration. It was further raised that decision making should not be based solely on evidence; it should also consider how the decision will affect real people and places. This prompted further discussion of the concern that, when presented with evidence or evidence-based recommendations, some ministers may favour solutions that support their political preferences and voters.
“A policy can live or die based on the presentation and the confidence of the presenter, in front of ministers.”
It was widely concluded that ministers need a much more standardised process for analysis and decision making. While there would be benefits to further education on how to read and interpret scientific data, it is clear that decision making extends far beyond merely examining the evidence.