Coastal Research

Publications

The following publications have been announced by our Institute of Coastal Systems – Analysis and Modeling. For further information, please contact the marked authors of the publications:


Krieger, D., Brune, S., Pieper, P., Weisse, R., & Baehr, J. (2022): Skillful decadal prediction of German Bight storm activity. Nat. Hazards Earth Syst. Sci., 22, 3993–4009, doi:10.5194/nhess-22-3993-2022

Abstract:

We evaluate the prediction skill of the Max Planck Institute Earth System Model (MPI-ESM) decadal hindcast system for German Bight storm activity (GBSA) on a multiannual to decadal scale. We define GBSA every year via the most extreme 3-hourly geostrophic wind speeds, which are derived from mean sea-level pressure (MSLP) data. Our 64-member ensemble of annually initialized hindcast simulations spans the time period 1960–2018. For this period, we compare deterministically and probabilistically predicted winter MSLP anomalies and annual GBSA with a lead time of up to 10 years against observations. The model produces poor deterministic predictions of GBSA and winter MSLP anomalies for individual years but fair predictions for longer averaging periods. A similar but smaller skill difference between short and long averaging periods also emerges for probabilistic predictions of high storm activity. At long averaging periods (longer than 5 years), the model is more skillful than persistence- and climatology-based predictions. For short aggregation periods (4 years and less), probabilistic predictions are more skillful than persistence but insignificantly differ from climatological predictions. We therefore conclude that, for the German Bight, probabilistic decadal predictions (based on a large ensemble) of high storm activity are skillful for averaging periods longer than 5 years. Notably, a differentiation between low, moderate, and high storm activity is necessary to expose this skill.
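For orientation, the sketch below illustrates the general approach of deriving geostrophic wind speeds from a gridded MSLP field and condensing them into an annual activity index. It is a minimal illustration only, not the authors' code: the gridded layout, the constants, and the 95th-percentile aggregation are assumptions made for the example.

import numpy as np

RHO_AIR = 1.225        # air density [kg m^-3]
OMEGA = 7.2921e-5      # Earth's rotation rate [s^-1]

def geostrophic_speed(mslp, dx, dy, lat_deg):
    """Geostrophic wind speed [m/s] from an MSLP field [Pa] on a regular grid."""
    f = 2.0 * OMEGA * np.sin(np.deg2rad(lat_deg))    # Coriolis parameter
    dpdy, dpdx = np.gradient(mslp, dy, dx)           # pressure gradients [Pa/m]
    return np.hypot(dpdx, dpdy) / (RHO_AIR * abs(f))

def annual_activity(mslp_3h, dx, dy, lat_deg, q=95):
    """Annual storm-activity proxy: upper percentile of 3-hourly geostrophic speeds."""
    speeds = np.array([geostrophic_speed(p, dx, dy, lat_deg) for p in mslp_3h])
    return np.percentile(speeds, q)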


Zhang, Z., Wagner, S., Klockmann, M., & Zorita, E. (2022): Evaluation of statistical climate reconstruction methods based on pseudoproxy experiments using linear and machine-learning methods. Clim. Past, 18, 2643–2668, doi:10.5194/cp-18-2643-2022

Abstract:

Three different climate field reconstruction (CFR) methods are employed to reconstruct spatially resolved North Atlantic–European (NAE) and Northern Hemisphere (NH) summer temperatures over the past millennium from proxy records. These are tested in the framework of pseudoproxy experiments derived from two climate simulations with comprehensive Earth system models. Two of these methods are traditional multivariate linear methods (principal component regression, PCR, and canonical correlation analysis, CCA), whereas the third method (bidirectional long short-term memory neural network, Bi-LSTM) belongs to the category of machine-learning methods. In contrast to PCR and CCA, Bi-LSTM does not need to assume a linear and temporally stable relationship between the underlying proxy network and the target climate field. In addition, Bi-LSTM naturally incorporates information about the serial correlation of the time series. Our working hypothesis is that the Bi-LSTM method will achieve a better reconstruction of the amplitude of past temperature variability. In all tests, the calibration period was set to the observational period, while the validation period was set to the pre-industrial centuries. All three methods tested herein achieve reasonable reconstruction performance on both spatial and temporal scales, with the exception of an overestimation of the interannual variance by PCR, which may be due to overfitting resulting from the rather short length of the calibration period and the large number of predictors. Generally, the reconstruction skill is higher in regions with denser proxy coverage, but it is also reasonably high in proxy-free areas due to climate teleconnections. All three CFR methodologies generally tend to more strongly underestimate the variability of spatially averaged temperature indices as more noise is introduced into the pseudoproxies. The Bi-LSTM method tested in our experiments using a limited calibration dataset shows relatively worse reconstruction skills compared to PCR and CCA, and therefore our working hypothesis that a more complex machine-learning method would provide better reconstructions for temperature fields was not confirmed. In this particular application with pseudoproxies, the implied link between proxies and climate fields is probably close to linear. However, a certain degree of reconstruction performance achieved by the nonlinear LSTM method shows that skill can be achieved even when using small samples with limited datasets, which indicates that Bi-LSTM can be a tool for exploring the suitability of nonlinear CFRs, especially in small data regimes.
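As a pointer to how the simplest of the three methods works, the following minimal Python sketch implements a generic principal component regression for field reconstruction: the target temperature field is compressed to its leading principal components over the calibration period, the components are regressed on the (pseudo)proxies, and the field is rebuilt for the full period. Variable shapes and the number of retained components are illustrative assumptions, not settings from the paper.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def pcr_reconstruct(field_calib, proxies_calib, proxies_all, n_pc=10):
    """
    field_calib   : (n_cal_years, n_gridcells) target field, calibration period
    proxies_calib : (n_cal_years, n_proxies)   pseudoproxies, calibration period
    proxies_all   : (n_years, n_proxies)       pseudoproxies, full period
    Returns the reconstructed field, shape (n_years, n_gridcells).
    """
    pca = PCA(n_components=n_pc)
    pcs_calib = pca.fit_transform(field_calib)              # leading PCs of the field
    reg = LinearRegression().fit(proxies_calib, pcs_calib)  # map proxies -> PCs
    pcs_hat = reg.predict(proxies_all)                      # predicted PCs, full period
    return pca.inverse_transform(pcs_hat)                   # back to grid space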


Erikson, L., Morim, J., Hemer, M., Young, I., Wang, X.L., Mentaschi, L., Mori, N., Semedo, A., Stopa, J., Grigorieva, V., Gulev, S., Aarnes, O., Bidlot, J.-R., Breivik, Ø., Bricheno, L., Shimura, T., Menendez, M., Markina, M., Sharmar, V., Trenham, C., Wolf, J., Appendini, C., Caires, S., Groll, N., & Webb, A. (2022): Global ocean wave fields show consistent regional trends between 1980 and 2014 in a multi-product ensemble. Commun Earth Environ 3, 320, doi:10.1038/s43247-022-00654-9

Abstract:

Historical trends in the direction and magnitude of ocean surface wave height, period, or direction are debated due to diverse data, time-periods, or methodologies. Using a consistent community-driven ensemble of global wave products, we quantify and establish regions with robust trends in global multivariate wave fields between 1980 and 2014. We find that about 30–40% of the global ocean experienced robust seasonal trends in mean and extreme wave height, period, and direction. Most of the Southern Hemisphere exhibited strong upward-trending wave heights (1–2 cm per year) and periods during winter and summer. Ocean basins with robust positive trends are far larger than those with negative trends. Historical trends calculated over shorter periods generally agree with satellite records but vary from product to product, with some showing a consistently negative bias. Variability in trends across products and time-periods highlights the importance of considering multiple sources when seeking robust change analyses.
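The notion of a "robust" trend can be illustrated with a simple multi-product screening: compute a linear trend per product and grid cell, then require a minimum fraction of products to agree on the sign of the ensemble-median trend. The sketch below is a generic illustration under assumed array shapes and an assumed 80% agreement threshold, not the criteria used in the study.

import numpy as np

def linear_trend(years, series):
    """Least-squares slope (units per year) of a 1-D annual time series."""
    return np.polyfit(years, series, 1)[0]

def robust_trend(years, ensemble, agree_frac=0.8):
    """
    ensemble : (n_products, n_years, ny, nx), e.g. annual-mean wave height
    Returns the ensemble-median trend and a mask of cells where at least
    agree_frac of the products share the sign of that median trend.
    """
    n_prod, _, ny, nx = ensemble.shape
    trends = np.empty((n_prod, ny, nx))
    for p in range(n_prod):
        for j in range(ny):
            for i in range(nx):
                trends[p, j, i] = linear_trend(years, ensemble[p, :, j, i])
    median_trend = np.median(trends, axis=0)
    agreement = np.mean(np.sign(trends) == np.sign(median_trend), axis=0)
    return median_trend, agreement >= agree_frac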


Kousal, J., Liu, Q., Staneva, J., Behrens, A., Günther, H., Bidlot, J.-R., & Babanin, A.V. (2022): Introducing Observation-Based Physics Into the WAM Wave Model. Proceedings of the ASME 2022 41st International Conference on Ocean, Offshore and Arctic Engineering. Volume 2: Structures, Safety, and Reliability. Hamburg, Germany. June 5–10, 2022, V002T02A005, ASME, doi:10.1115/OMAE2022-79652

Abstract:

Third generation spectral wave models predict the growth, decay and transformation of wind-generated waves. WAM is one of the two dominant third-generation spectral wave models covering the global ocean. The standard representation of deep-water wave physics in WAM is the semi-empirical approach of Ardhuin et al. (2010; hereafter ARD). The other modern package of wave physics for spectral models, available in WAVEWATCH-III and SWAN, but not in WAM, is the observation-based approach developed by A. V. Babanin, I. R. Young, M. A. Donelan, M. L. Banner and W. E. Rogers (hereafter BYDBR). Its source functions were measured and therefore are not subject to tuning (within confidence limits of the measurements). Furthermore, the observations revealed and/or quantified physical phenomena missing or neglected in other source packages for spectral wave models such as non-linear airflow separation (hence relative reduction of wind input at strong winds), steepness-dependent wave growth (making the wind input a weakly non-linear function of the wave spectrum), negative wind input under adverse winds, two-part wave dissipation consisting of a local in wavenumber space term and a cumulative term (which is an integral of the spectrum), a wave breaking threshold below which no breaking occurs, and swell decay due to interaction with and production of oceanic turbulence. This work documents the implementation of BYDBR physics into WAM. We find that the performance of WAM using BYDBR physics is comparable to the standard with respect to the mean-based metrics. The mean differences between BYDBR and ARD are most prominent in Boreal winter, with widespread reductions in the Northern Hemisphere of ∼10 cm and ∼0.2 s for significant wave height and mean wave period respectively.
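The diagnostics quoted above follow from the standard spectral definitions of significant wave height and mean wave period, sketched below for a 1-D frequency spectrum. This illustrates the metrics only; it is not part of the WAM source-term implementation described in the paper.

import numpy as np

def spectral_moment(freq, spectrum, n):
    """n-th moment m_n = integral of f**n * E(f) df for a spectrum E(f) [m^2/Hz]."""
    return np.trapz(freq**n * spectrum, freq)

def significant_wave_height(freq, spectrum):
    """Hs = 4 * sqrt(m0) [m]."""
    return 4.0 * np.sqrt(spectral_moment(freq, spectrum, 0))

def mean_wave_period(freq, spectrum):
    """Tm01 = m0 / m1 [s]."""
    return spectral_moment(freq, spectrum, 0) / spectral_moment(freq, spectrum, 1)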


Paasche, H., Gross, M., Lüttgau, J., Greenberg, D.S., & Weigel, T. (2022): To the brave scientists: Aren’t we strong enough to stand (and profit from) uncertainty in Earth system measurement and modelling? Geoscience Data Journal, 9, 393–399, doi:10.1002/gdj3.132

Abstract:

The current handling of data in Earth observation, modelling and prediction measures gives cause for critical consideration, since we all too often carelessly ignore data uncertainty. We think that Earth scientists are generally aware of the importance of linking data to quantitative uncertainty measures. But we also think that uncertainty quantification of Earth observation data too often fails at very early stages. We claim that data acquisition without uncertainty quantification is not sustainable and that machine learning and computational modelling cannot unfold their potential when analysing complex natural systems like the Earth. Current approaches such as stochastic perturbation of parameters or initial conditions cannot quantify uncertainty or bias arising from the choice of model, limiting scientific progress. We need incentives stimulating the honest treatment of uncertainty starting during data acquisition, continuing through analysis methodology and prediction results. Computational modellers and machine learning experts have a critical role, since they enjoy high esteem from stakeholders and their methodologies and their results critically depend on data uncertainty. If both want to advance their uncertainty assessment of models and predictions of complex systems like the Earth, they have a common problem to solve. Together, computational modellers and machine learners could develop new strategies for bias identification and uncertainty quantification offering a more all-embracing uncertainty quantification than any known methodology. But since it starts for computational modellers and machine learners with data and their uncertainty, the fundamental first step in such a development would be leveraging stakeholder esteem to insistently advocate for reduction of ignorance when it comes to uncertainty quantification of data.
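The point about stochastic perturbation can be made concrete with a toy experiment: perturbing the parameters of one fixed model produces an ensemble spread that reflects parametric uncertainty only, while a structurally different "truth" can lie far outside that spread. Everything in the sketch (model form, parameter values, noise levels) is invented purely for illustration and has no connection to the paper.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)

def chosen_model(t, a, b):
    """A single model structure whose choice is never itself questioned."""
    return a * np.exp(-b * t)

# Ensemble from stochastically perturbed parameters around nominal values
ensemble = np.array([
    chosen_model(t, rng.normal(1.0, 0.05), rng.normal(0.3, 0.02))
    for _ in range(500)
])
spread = ensemble.std(axis=0)          # parametric uncertainty only

# A structurally different "truth" (here with an oscillatory component) can sit
# well outside the perturbation envelope: the structural bias the ensemble misses.
truth = np.exp(-0.3 * t) * (1.0 + 0.2 * np.sin(2.0 * t))
outside = np.abs(truth - ensemble.mean(axis=0)) > 2.0 * spread
print(f"Fraction of time steps where the truth falls outside 2 sigma: {outside.mean():.2f}")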
