How is Big Data research and analysis used to improve public policy and services?

22 September 2015

Reported by Henry Rex, CSaP Policy & Communications Officer.

On 18 September 2015, CSaP teamed up with Cambridge Big Data and the Royal Statistical Society to run a workshop exploring how data science and ‘Big Data’ can help inform and improve public policy. The workshop was aimed at an audience of early-career researchers and civil servants.

The purpose of the workshop was not only to discuss the benefits of Big Data for public policy, but also to offer advice on how the academic and policy-making communities might best engage to build relationships based on understanding, respect and trust.

The first panel, “How can research on Big Data contribute to public policy?”, was chaired by Hetan Shah, Executive Director of the Royal Statistical Society. Chris Nott (CTO Big Data & Analytics, IBM UK & Ireland) highlighted several powerful examples of how Big Data could inform policy decisions, such as mapping likely archaeological sites during infrastructure planning. Lorraine Dearden (Professor of Economics and Social Statistics, Institute of Education, UCL) focused on how government could make better use of the vast quantities of administrative data it holds. Using ‘Looked After Children’ as an example, she demonstrated how linking data sets across government could deliver real societal benefits. Libby Bishop (UK Data Archive, University of Essex) explored the ethical issues surrounding government use of Big Data: alongside privacy, there are concerns around the provenance and quality of Big Data, as well as consent, accountability, and access.

Rob Doubleday, the Executive Director of CSaP, chaired a discussion on “How do policy makers make the best use of Big Data?”. Sue Bateman (Deputy Director, Data Sharing and Data Science, Cabinet Office) described how the Cabinet Office is embedding data science into the civil service to improve services and operations. She highlighted 15 ‘alpha’ projects in which her team are turning large data sets into useful tools and analysis for government departments, and emphasised that the best results come when policy makers have a real business need and the data analysis can tackle a specific problem. Philip Bradburn (Big Data analysis lead, National Audit Office) then described the NAO’s use of data analytics, emphasising the importance of effective communication when explaining how Big Data can be used in policy formation.

Theresa Chambers (Head of Profession for Operational Research, Home Office) enthusiastically described some projects where the Home Office is using data science to good effect, and addressed the ethical problems that sometimes arise from them. She highlighted the Fedora Project to demonstrate how effective collaboration between government and academia can be in this field, stressing the importance of trust, openness and communication. Ricky Taylor (Senior Economic Advisor, Department for Communities and Local Government) spoke about the Troubled Families Programme, the biggest data-linkage project in Whitehall, and described some of the advantages and limitations of using linked administrative data rather than collecting new data.

In a session designed to give attendees a taster of how ministers are briefed and policy decisions taken, they worked in small teams to address a current policy issue – the national roll-out of care.data – and present their recommendations to a panel.

A final discussion followed, covering issues such as organisational change, inter-departmental data sharing, ethics, pilot schemes, and the need for a dedicated organisation to provide a safe space to continue discussions around Big Data.

(Banner image from Gord McKenna via Flickr)