Extremes and Rare Events in Climate and Related Applications

3 July 2024

University of Reading

Abstracts

Ignacio Del Amo Blanco

Peaks over threshold method for dynamical systems in climate science

Recently, a Peaks over Threshold method based on Extreme Value Theory for dynamical systems has found a plethora of applications in climate science and other fields. This method is very convenient for data applications since it requires only one realisation of the process to work; however, it is based on an abstract proof which does not indicate which kinds of systems satisfy its mathematical requirements. Here we try to verify these properties for some systems, analytically when possible and numerically when not, and to study the reliability of this method in the absence of some of these properties.
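In its most common form, the method attaches to each state along a trajectory the observable g(x) = -log dist(x, ζ) for a chosen reference state ζ and fits a generalised Pareto distribution to its exceedances of a high threshold. A minimal illustrative sketch in Python, using the Hénon map and this standard observable as stand-ins for the systems examined in the talk (all parameter values are illustrative), is:

    import numpy as np
    from scipy.stats import genpareto

    # Long trajectory of a toy chaotic system (Henon map) as a stand-in.
    def henon_trajectory(n, a=1.4, b=0.3, burn=1000):
        x, y = 0.1, 0.1
        traj = np.empty((n, 2))
        for i in range(n + burn):
            x, y = 1.0 - a * x**2 + b * y, b * x
            if i >= burn:
                traj[i - burn] = (x, y)
        return traj

    traj = henon_trajectory(100_000)
    zeta = traj[50_000]                      # reference state on the attractor

    # Observable: minus the log-distance to the reference state.
    dist = np.linalg.norm(traj - zeta, axis=1)
    g = -np.log(dist[dist > 0])              # drop the reference point itself

    # Peaks over threshold: exceedances of a high quantile, fitted by a GPD.
    u = np.quantile(g, 0.99)
    exceed = g[g > u] - u
    xi, _, sigma = genpareto.fit(exceed, floc=0.0)

    # For this observable the shape should be near zero and 1/scale
    # estimates the local dimension of the attractor at zeta.
    print(f"shape = {xi:.3f}, local dimension estimate = {1.0 / sigma:.2f}")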

Richard Everitt

Ensemble Kalman inversion approximate Bayesian computation

Approximate Bayesian computation (ABC) is the most popular approach to inferring parameters in the case where the data model is specified in the form of a simulator. It is not possible to directly implement standard Monte Carlo methods for inference in such a model, since the likelihood is not available to evaluate pointwise. The main idea of ABC is to perform inference on an alternative model with an approximate likelihood, sometimes known as the ABC likelihood. The ABC likelihood is chosen such that an unbiased estimator of it is easy to construct from simulations from the data model, allowing the use of pseudo-marginal Monte Carlo algorithms for inference under the approximate model. The central challenge of ABC is then to trade off the bias introduced by approximating the model against the variance introduced by estimating the ABC likelihood. Stabilising the variance of the ABC likelihood requires a computational cost that is exponential in the dimension of the data, so the most common approach to reducing the variance is to perform inference conditional on summary statistics. In this talk we introduce a new approach to estimating the ABC likelihood: using ensemble Kalman inversion (EnKI). Ensemble Kalman algorithms are Monte Carlo approximations of Bayesian inference for linear/Gaussian models. These methods are often applied outside of the linear/Gaussian setting, being used, for example, as an alternative to particle filtering for inference in non-linear state space models. Loosely speaking, EnKI can be used as an alternative to an SMC sampler on a sequence of annealed likelihoods. We see that EnKI has some appealing properties when used to estimate the ABC likelihood. It circumvents the exponential scaling with dimension of standard ABC, and does not require the reparameterisation imposed by the rare-event SMC approach of Prangle et al. (2018). It achieves this with no additional simulations from the data model, so it is likely to bring the most benefit in cases where this model is very expensive to simulate.
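For background, ensemble Kalman inversion iteratively shifts a parameter ensemble towards the observed data using empirical cross-covariances. A minimal sketch of a single generic EnKI update in Python (the function names, the assumed simulator `simulate` and data `y_obs` are illustrative; this shows only the generic step, not the specific ABC-likelihood estimator developed in the talk) is:

    import numpy as np

    def enki_step(theta, simulate, y_obs, noise_cov, step=1.0, rng=None):
        """One generic ensemble Kalman inversion update of a parameter ensemble.

        theta     : (J, p) array, current parameter ensemble
        simulate  : maps a (p,) parameter vector to a (d,) simulated data vector
        y_obs     : (d,) observed data
        noise_cov : (d, d) observation noise covariance
        step      : annealing step size (divides the noise covariance)
        """
        rng = np.random.default_rng() if rng is None else rng
        J, d = theta.shape[0], len(y_obs)
        y_sim = np.array([simulate(t) for t in theta])      # run the simulator

        theta_c = theta - theta.mean(axis=0)                # centred ensembles
        y_c = y_sim - y_sim.mean(axis=0)
        C_ty = theta_c.T @ y_c / (J - 1)                    # cross-covariance (p, d)
        C_yy = y_c.T @ y_c / (J - 1)                        # data covariance (d, d)

        gain = C_ty @ np.linalg.inv(C_yy + noise_cov / step)
        # Perturbed observations preserve the spread of the updated ensemble.
        y_pert = y_obs + rng.multivariate_normal(np.zeros(d), noise_cov / step, size=J)
        return theta + (y_pert - y_sim) @ gain.T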

Tobias Grafke

Transition path sampling for noise induced tipping

Tipping points and rare transitions in metastable systems are a notoriously hard problem in complex systems science and climate science. Noise-induced tipping, where stochasticity pushes the system over an activation barrier into a novel long-lived state, falls into the purview of transition path sampling. In particular, we are often interested in drawing from the ensemble of transition paths to investigate typical tipping trajectories and their physical properties. In this talk, I will present a novel mechanism for sampling the transition path ensemble even in the presence of multiple coexisting tipping mechanisms, by combining path-space Markov chain Monte Carlo with metadynamics.
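One generic path-space strategy, sketched below in Python for a one-dimensional gradient double well (parameter values are illustrative, and the combination with metadynamics and the multiple-mechanism setting of the talk are not reproduced), samples the reactive path ensemble by making preconditioned Crank-Nicolson proposals on the driving noise and accepting whenever the proposed path still makes the transition:

    import numpy as np

    rng = np.random.default_rng(0)
    dt, n_steps, eps, beta = 1e-3, 4000, 0.25, 0.1   # illustrative values

    def drift(x):                  # double well V(x) = (x**2 - 1)**2 / 4
        return -x * (x**2 - 1.0)

    def integrate(xi):
        """Euler-Maruyama path started in the left well, driven by noise xi."""
        x = np.empty(n_steps + 1)
        x[0] = -1.0
        for k in range(n_steps):
            x[k + 1] = x[k] + drift(x[k]) * dt + np.sqrt(2.0 * eps * dt) * xi[k]
        return x

    def is_reactive(x):            # did the path reach the right well?
        return x.max() > 0.9

    # Brute-force search for an initial transition path.
    xi = rng.standard_normal(n_steps)
    while not is_reactive(integrate(xi)):
        xi = rng.standard_normal(n_steps)
    x = integrate(xi)

    # Path-space MCMC: pCN proposals on the noise leave its Gaussian law
    # invariant, so a proposal is accepted whenever it remains reactive.
    paths = []
    for _ in range(2000):
        xi_prop = np.sqrt(1.0 - beta**2) * xi + beta * rng.standard_normal(n_steps)
        x_prop = integrate(xi_prop)
        if is_reactive(x_prop):
            xi, x = xi_prop, x_prop
        paths.append(x)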

Frank Kwasniok

Linking large-scale atmospheric regimes with local extreme weather events using advanced clustering techniques

Nicholas Leach

Predicting the counterfactual: opportunities and challenges of forecast-based attribution

Valerio Lucarini

Fingerprinting Persistent Extreme Events using Large Deviation Theory

Elucidating the statistical properties of extreme meteo-climatic events and capturing the physical processes responsible for their occurrence are key steps for improving our understanding of climate variability and climate change and for better evaluating the associated hazards. It has recently become apparent that large deviation theory (LDT) is very useful for investigating persistent extreme events, and specifically, for flexibly estimating long return periods and for introducing a notion of dynamical typicality. Using a methodological framework based on LDT and taking advantage of long simulations by a state-of-the-art Earth system model, we investigate the 2021 Western North America summer heatwave. Indeed, our analysis shows that the 2021 event can be seen as an unlikely but possible manifestation of climate variability, whilst its probability of occurrence is greatly amplified by ongoing climate change. We also clarify the properties of spatial coherence of the 2021 heatwave and elucidate the role played by the Rocky Mountains in modulating hot, dry, and persistent extreme events in the Pacific Northwest region of North America.
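In schematic terms, for a time-averaged observable $A_T = \frac{1}{T}\int_0^T X(t)\,\mathrm{d}t$ the LDT framework rests on the asymptotics

\[ P(A_T \ge a) \asymp e^{-T I(a)}, \qquad I(a) = \sup_{k}\{ k a - \lambda(k)\}, \qquad \lambda(k) = \lim_{T\to\infty} \frac{1}{T}\log \mathbb{E}\!\left[ e^{k T A_T} \right], \]

so that the return period of a persistent anomaly of amplitude $a$ over an averaging window of length $T$ grows like $\exp(T I(a))$ (standard notation; the estimators actually used in the talk may differ in detail).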

Claudia Neves

An off-shoot of the Hill estimator for threshold exceedances of block maxima

In this talk, we introduce a sub-domain of attraction for heavy-tailed distributions that offers a smooth transition between two separate and well-established approaches in the statistical analysis of extreme events: Peaks over Threshold (POT) and Block Maxima (BM). The conditions of regular variation which completely characterise this golden sub-domain of the classical Fréchet domain of attraction render the Hill estimator the natural choice for estimating the extreme value index pertaining to the unknown initial distribution function F, even when the available sample consists of block maxima (or annual maxima, in Gumbel's terminology) stemming from blocks of small size. In order to derive the asymptotic normality of the proposed Hill estimator, now equipped with a carefully chosen shift in location, we develop a new representation for its underpinning tail empirical process with respect to a weighted uniform norm. The Gaussian process arising in the limit is Brownian motion. Although unsurprising for a POT-related approach, this marks a clear difference from the Brownian bridge typically associated with nonparametric inference based on BM.
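For reference, the classical Hill estimator built from the $k$ largest order statistics $X_{n-k,n} \le \dots \le X_{n,n}$ of a sample of size $n$ is

\[ \hat{\gamma}^{\mathrm{H}}_{k,n} = \frac{1}{k}\sum_{i=0}^{k-1}\bigl(\log X_{n-i,n} - \log X_{n-k,n}\bigr); \]

the location shift introduced in the talk, and the block-maxima sampling scheme, modify this basic form.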

Marilena Oltmanns

The ocean's role in weather and climate extremes

Christian Rohrbeck

Applications of the Tail Pairwise Dependence Matrix

The tail pairwise dependence matrix (TPDM) has proven successful for summarising and analysing the distributions of multivariate extremes using ideas from principal component analysis. In this talk, we will describe how this construct can be used to (1) generate samples from the tail of a multivariate distribution and (2) test for a change in the dependence structure. We illustrate these methods using simulated data, sea surface temperature data from the Red Sea and UK river flow data.
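For reference, for a regularly varying random vector $X \in \mathbb{R}_+^d$ with tail index $\alpha = 2$ and angular measure $H_X$ on the positive unit sphere $\Theta_+^{d-1}$, the TPDM (following Cooley and Thibaud, 2019) is

\[ \Sigma = (\sigma_{ij}), \qquad \sigma_{ij} = \int_{\Theta_+^{d-1}} \theta_i \theta_j \, \mathrm{d}H_X(\theta), \]

estimated in practice from the angular components of the observations with the largest radial parts.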

Joran Rolland

Using a learnt stochastic model to study the regime changes of a bistable wake

When two parallel bars of thickness H, in a flow at Reynolds number R = HU/nu ~ 10000 (where U is the incoming velocity and nu the kinematic viscosity), are brought close enough together (as measured by the gap ratio G/H, where G is the distance between the bar centres), the wakes of the two bars merge and the jet coming out between the bars no longer flutters symmetrically but can point sideways towards either of the bars. Owing to the turbulent environment, from the incoming turbulence to the turbulent wake, the jet direction can change abruptly, rendering this flow bistable. Given the simplicity of the configuration, this flow is thus a good model configuration for testing methods for the study of bistability. I will present the results of processing data sampled by PIV in the LMFL wind tunnel during three campaigns. If time permits, I will also present preliminary numerical modelling using URANS. I will present the regime changes, from symmetric to bistable to tristable, as the gap ratio is decreased. They can be characterised using simple statistical quantities such as the weighted position of the jet, or time series of the leading POD modes and their PDFs, as well as the mean first passage time before a change of direction of the jet. However, using a stochastic model discovery method that can select an analytical form for the model and the coefficients in it is also enlightening for the study of the bistability and the change of regimes. Indeed, for a given value of the control parameters, the regime type is contained in the selected terms in the model, the signs of the coefficients and their amplitudes. This procedure also shows some ability to capture the main features of the system from short time series. Quantities such as the mean first passage time, which usually require long time series to be estimated with precision, can then be approximated from resimulations of the model. This work was done in collaboration with Luis Baiza, Antoine Barlet, Pierre Bragança, Jiangang Chen, Christophe Cuvier, Matteo Gatti and Indra Kanshana.
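A generic flavour of such model fitting, sketched below in Python for a scalar Langevin model with a polynomial drift estimated by least squares from the increments of a time series (all names and parameter values are illustrative; the actual discovery method used in this work, which also selects which terms to retain, may differ), is:

    import numpy as np

    def fit_langevin(x, dt, degree=3):
        """Least-squares fit of a polynomial drift f in dx = f(x) dt + s dW
        from a scalar time series sampled at interval dt (a generic sketch,
        not the specific model-discovery procedure used in this work)."""
        dxdt = np.diff(x) / dt
        library = np.vander(x[:-1], degree + 1, increasing=True)   # 1, x, x^2, x^3
        coeffs, *_ = np.linalg.lstsq(library, dxdt, rcond=None)
        residual = dxdt - library @ coeffs
        s = np.sqrt(np.mean(residual**2) * dt)                     # noise amplitude
        return coeffs, s

    # Synthetic bistable test signal: dx = (x - x^3) dt + 0.5 dW.
    rng = np.random.default_rng(1)
    dt, n = 1e-2, 200_000
    x = np.empty(n)
    x[0] = 1.0
    for k in range(n - 1):
        x[k + 1] = x[k] + (x[k] - x[k]**3) * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()

    coeffs, s = fit_langevin(x, dt)
    print("drift coefficients (1, x, x^2, x^3):", np.round(coeffs, 2))
    print("noise amplitude:", round(s, 2))

Long resimulations of a fitted model of this kind are what allow quantities such as the mean first passage time to be approximated from comparatively short observational records.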

Ted Shepherd

Storylines of high-impact climate events

Unprecedented high-impact climate events are becoming a frequent occurrence, and invariably attract media coverage. The scientific attribution statement that supports the media report is usually based on a semi-operational rapid attribution analysis using extreme value theory. Yet increasingly, the event is found to be impossible even in today’s climate, not only in pre-industrial climate, raising questions about the reliability of the attribution method. Moreover, the connection to the human impact of the event (which is what underlies the media coverage) is not clear. How can scientists make sense of an unprecedented high-impact climate event that has just happened? In my talk I will discuss some of the challenges raised by this question.