How can machine learning techniques improve our understanding of climate change risks in the future?

16 February 2017


Reported by Anna Fee, NERC-funded CSaP Policy Intern (January-April 2017)

Dr Emily Shuckburgh is Head of Open Oceans at the British Antarctic Survey. In her talk, she explained how machine learning techniques can be used to improve climate model predictions and provide a more sophisticated understanding of climate change risks in the future.

Dr Shuckburgh reminded us of the extreme weather events the world has experienced in recent years, from Hurricane Sandy and the damage it caused to the New York subway, to flooding in the UK and the severe heatwaves across Europe which, in 2015, caused 70,000 premature deaths.

The latest climate change risk assessment released by the UK government states that future climate change could pose risks to global food production and water supplies, biodiversity and ecosystems, in addition to the risks associated with newly emerging pests and diseases.

Climate models can predict the impacts of climate change on global systems, but they are less well suited to local systems. The data may carry systematic biases, and different models produce slightly different projections which sometimes diverge from observed data. A significant source of uncertainty in these predictions is that they depend on how far emissions are reduced in future, which is not yet known.

To better understand present and future climate risks we need to account for high-impact but low-probability events. More risk-based approaches, which examine extremes and exceedances of particular climate thresholds, can show how climate change will affect whole systems rather than individual climate variables, and therefore aid decision making. Example studies using these methods have examined the need for air conditioning in Cairo to cope with summer heatwaves, and the subsequent impact on the Egyptian power network.

The background risk of having such hot summers across Europe was once 1 in 1,000 years. By the end of the 20th century it was 1 in 50 years, and today it is estimated to be 1 in 5 years.
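These return periods can be translated into more intuitive odds. As a rough illustration (assuming, unrealistically, that each year is independent), the chance of experiencing at least one such summer within a given decade can be computed directly:

```python
# Illustrative sketch only: convert a return period (e.g. 1-in-50 years)
# into the probability of at least one occurrence in a decade,
# under the simplifying assumption that years are independent.
def chance_in_decade(return_period_years: float) -> float:
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** 10

for rp in (1000, 50, 5):
    print(f"1-in-{rp}-year summer: {chance_in_decade(rp):.0%} chance in any decade")
```

Under this simple model, the shift from a 1-in-1,000-year to a 1-in-5-year event turns a once-in-many-lifetimes rarity into something most people should expect to experience within a decade.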

A collaboration with Google DeepMind, an artificial intelligence company specialising in applying deep learning and neural networks to real-world problems, could employ machine learning frameworks that draw on information from historic climate data to refine climate model predictions, and hopefully provide a more sophisticated understanding of future impacts.

Dr Thomas Walters, Senior Research Scientist at Google DeepMind, explained that applying Gaussian process approaches to historic data allows you to extract as much information as possible from small amounts of data. Deep learning techniques, in contrast, suit the very large datasets used for climate modelling and can capture non-linear correlations between data separated in time and space.
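No implementation details were shared in the talk, but the Gaussian process idea can be sketched in a few lines of Python using only numpy. The years and temperature anomalies below are invented illustrative numbers, not real climate data; the point is that the method returns both a prediction and an uncertainty from a very small dataset:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=2.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=0.1):
    """Gaussian process regression: posterior mean and std dev at x_test."""
    K = rbf_kernel(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Hypothetical small record of annual temperature anomalies (illustrative only)
years = np.array([2000.0, 2003.0, 2006.0, 2010.0, 2013.0, 2016.0])
anomaly = np.array([0.39, 0.54, 0.57, 0.66, 0.62, 0.94])

grid = np.linspace(2000, 2016, 5)
mean, std = gp_predict(years - 2000, anomaly, grid - 2000)
for yr, m, s in zip(grid, mean, std):
    print(f"{yr:.0f}: {m:.2f} ± {s:.2f} °C")
```

The key contrast with deep learning is visible here: the entire "model" is a covariance function plus a handful of observations, so it works with tiny datasets, but the matrix solve scales poorly to the millions of data points a neural network could handle.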

A Q&A session followed the talk.
