Guest post by Dr Chris Tyler and Dr David Rose

Credibility is valued by Parliament, but what does ‘credible’ mean?

22 April 2020

An ESRC-funded study led by University College London with the Parliamentary Office of Science and Technology (POST) recently published a paper on how the UK Parliament sources and uses evidence, including how academics can better engage to maximise the chances of evidence-informed decision-making. The study was undertaken partly to strengthen interest in science advice to legislatures, which are an important venue for scrutiny of government policy and for law-making, yet are often side-lined in favour of studies of executives (governments).

The study involved 157 people in Parliament (MPs, MPs’ staff, parliamentary staff, and Peers) through interviews and surveys, and one researcher also conducted participant observation of committee meetings. We were interested in how and why evidence was used by decision-makers. Parliamentarians defined evidence broadly, and almost all respondents considered it useful. Factors such as accessibility (no paywalls), timeliness (evidence submitted to committee calls on time), and clarity (clear syntheses without jargon) were considered important by parliamentarians in determining whether a piece of evidence was used in decision-making. To this end, the paper makes a number of recommendations to improve academic engagement with Parliament: making evidence open access, undertaking more syntheses of evidence, working with trusted knowledge brokers such as POST, and reforming universities and funders to prioritise the production of policy-relevant knowledge and to give academics the skills and time necessary for good engagement.

Above all, however, credibility was identified as the most important factor in determining whether or not a piece of evidence would be used. In the academic community, the credibility of evidence is usually judged on the strength of its methodology, as interrogated by peer reviewers. For parliamentarians, however, credibility was more often judged on the source of the evidence. Certain sources were perceived as having a political bias, an ‘axe to grind’, whilst others were viewed more favourably, including the House Libraries, POST, and independent bodies such as the Office for National Statistics. The credibility of a source was also judged on the strength of personal recommendation.

This is not to say that parliamentarians did not appraise the quality of methodologies, but 26% of survey respondents (MPs, MPs’ staff, and parliamentary staff) reported that they would not feel confident performing such an appraisal, rising to 45% among MPs. Parliamentary staff, such as Library staff, were the most confident in appraising evidence, and clerks to committees and specialist advisers also performed this role to help MPs and Peers interrogate the evidence. Specific forms of evidence, particularly statistics, were deemed the most objective, without much appreciation of the methodology behind how those statistics were produced.

The study argues that there are lessons for both academics and Parliament. For academics, it is clearly important to establish a credible reputation over the course of a scientific career and to engage with Parliament in an objective and non-partisan manner where possible. The implication of the study is that parliamentarians would be warier of individual academics perceived as holding a personal position, although attempts are made to balance sources of evidence from across the political spectrum. Methodologies could also be communicated to Parliament in accessible language, so that it is easy for the reader to understand what the researchers did and how they arrived at their conclusions.

For Parliament, there are clear lessons to be learned in terms of adopting practices to appraise the quality of evidence better. Of course, this should not mean that certain kinds of knowledge (e.g. scientific evidence) are automatically placed on a pedestal above other forms, since Parliament should be an inclusive venue. POST already provides training and opportunities to improve parliamentarians’ knowledge of evidence-production methods, which is important because few MPs and Peers are experts in such matters. A recent summary of COVID-19 modelling by POST, and other initiatives such as POSTnotes, can help boost understanding of an issue amongst parliamentarians, helping MPs and Peers know what questions to ask to strengthen scrutiny of government, and to understand the methods and uncertainty associated with different possible interventions. Pairing schemes, where policy-makers and scientists spend time with one another, might also be an avenue through which to stress the value of sound methodologies. But all of these measures need a high level of buy-in from parliamentarians themselves, and incentives in place to pursue such opportunities in a time-poor environment.

Ultimately, Parliament needs to use credible evidence to support its decision-making, not least to scrutinise effectively, and hence improve, government policy at this time of national crisis. Academics have a role to play in engaging with Parliament effectively, responding to calls for evidence, and presenting information in non-partisan, accessible ways. But greater support is also needed within Parliament to help decision-makers appraise and use evidence more effectively.

This blog draws on research supported by the Economic and Social Research Council and the Houses of Parliament (including POST). The full open access paper, ‘Improving the use of evidence in legislatures: the case of the UK Parliament’ (2020) by Rose, D. C., Kenny, C., Hobbs, A., and Tyler, C., can be found here.


Dr David Rose is the Elizabeth Creak Associate Professor of Agricultural Innovation and Extension at the University of Reading.

Dr Chris Tyler is Director of Research and Policy in University College London’s Department of Science, Technology, Engineering and Public Policy (UCL STEaPP), where he leads STEaPP’s policy programmes and explores how policy makers use scientific evidence. Prior to joining STEaPP, Chris spent five years as Director of the UK’s Parliamentary Office of Science and Technology (POST) and before that was the first Executive Director of the Centre for Science and Policy (CSaP) at the University of Cambridge.