Title Using Very Large Ensembles, Storm Tracking and Volunteer Computing to Validate a Climate Model for Detection and Attribution Studies.
Author(s) Neil Massey, Tolu Aina, Milo Thurston, Chris Huntingford, Dáithí Stone, Peter Stott, Myles Allen
Conference EGU General Assembly 2011
Media type Article
Language English
Digital document PDF
Published In: GRA - Volume 13 (2011)
Record number 250055716
 
Abstract
Detection and Attribution studies are often used to determine whether the increase in anthropogenic emissions of greenhouse gases has affected a climate variable, such as temperature or precipitation. There is particular interest, from both scientists and the media, when an extreme weather event occurs, such as flooding or a heatwave. Atmospheric General Circulation Models (AGCMs) are often used in such studies to model, at and around the time of interest, both the observed climate and a "world that could have been", that is, the climate with anthropogenic forcings removed. An optimal detection algorithm is then applied to the output of each scenario. Such studies have typically used a single model or a small ensemble of models per scenario. Work pioneered at Oxford by Pardeep Pall and the climateprediction.net team used a very large ensemble (around 1,000 members per scenario) of AGCMs to study the effect of anthropogenic global warming on flood risk in the United Kingdom. However, that study made no attempt to verify that the AGCM can accurately represent extremes of weather in the past. This study aims to address that gap by simulating the climate over the past 50 years and comparing the model output to the mean and extremes of climate variables, such as temperature and precipitation, in observed datasets. The HadAM3P AGCM from the UK Met Office is run in a time-slice manner over the period 1960 to 2007. The period is divided into two-year, overlapping sub-periods, for which forcings and initial conditions are generated. The Hadley Centre HadISST observational dataset provides the sea surface temperature and sea ice forcings, while the greenhouse gas forcings are consistent with the IPCC SRES A1B scenario. A very large perturbed-initial-condition ensemble is computed via the climateprediction.net volunteer distributed computing network.
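The time-slice decomposition described above can be sketched in a few lines. This is a hypothetical illustration, not code from the study: the two-year slice length follows the abstract, while the one-year overlap is an assumption made here for concreteness.

```python
# Hypothetical sketch: split the 1960-2007 period into overlapping
# time-slices. The two-year length matches the abstract; the one-year
# overlap is an assumption for illustration only.
def time_slices(start=1960, end=2007, length=2, overlap=1):
    """Return (first_year, last_year) tuples for overlapping sub-periods."""
    step = length - overlap
    slices = []
    year = start
    while year + length - 1 <= end:
        slices.append((year, year + length - 1))
        year += step
    return slices

slices = time_slices()
print(slices[:3])   # [(1960, 1961), (1961, 1962), (1962, 1963)]
print(len(slices))  # 47 slices, one per year of the analysis period
```

With a one-year overlap, each calendar year inside the period is covered by two consecutive slices, which is one common reason to overlap time-slice experiments (spin-up can be discarded at the start of each slice).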
To analyse the ensemble, techniques from forecast verification, such as reliability diagrams and the Brier score, are used to compare the output from the ensemble to the ECMWF ERA-40 reanalysis data. The analysed ensemble consists of around 3,000 members per time-slice, giving a total ensemble size over the entire 47-year period of around 135,000, and the data are analysed globally and for selected regions. Good correspondence is found between the mean and extreme values of climate variables in the ensemble and in ERA-40, especially for precipitation over the North Atlantic and Western Europe region. In addition to this analysis, an objective storm identification and tracking algorithm is applied to a smaller, randomly chosen subset of the ensemble. The spatial distribution of storms over the North Atlantic is studied, along with the intensity of the storms and the length and persistence of the tracks. These tracks are then compared to those obtained by applying the same analysis to the ERA-40 data. The results show a good spatial correlation between the storm tracks in the ensemble and those in ERA-40. The good correspondence between the ensemble data and ERA-40 lends confidence to using HadAM3P as a model for Detection and Attribution studies of the change in extreme weather risk due to anthropogenic greenhouse gas emissions.
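The Brier score mentioned above is simply the mean squared difference between forecast probabilities and binary (0/1) observed outcomes. A minimal sketch, assuming the ensemble forecast probability of an event is estimated as the fraction of members exceeding a threshold; the numbers below are toy values for illustration, not data from the study:

```python
# Toy sketch of the Brier score: the mean squared difference between
# forecast probabilities and binary observed outcomes. Lower is better;
# 0 corresponds to a perfect deterministic forecast.
def brier_score(forecast_probs, observed):
    return sum((p - o) ** 2 for p, o in zip(forecast_probs, observed)) / len(observed)

def ensemble_event_prob(members, threshold):
    """Estimate P(event) as the fraction of ensemble members above a threshold."""
    return sum(1 for x in members if x > threshold) / len(members)

# Three toy forecast cases, each with a 10-member "ensemble"
cases = [
    [1.0, 2.0, -1.0, 3.0, 0.5, -0.2, 1.0, 2.0, 0.1, -3.0],
    [-1.0, -2.0, -0.5, 0.1, -3.0, -1.0, -2.0, -0.4, -1.0, -5.0],
    [2.0, 3.0, 1.0, 4.0, 2.0, 1.0, 3.0, 2.0, 1.0, 2.0],
]
probs = [ensemble_event_prob(m, threshold=0.0) for m in cases]  # [0.7, 0.1, 1.0]
obs = [1, 0, 1]  # whether the event occurred in the "reanalysis"
print(round(brier_score(probs, obs), 4))  # 0.0333
```

A reliability diagram extends this idea by binning the forecast probabilities and plotting, for each bin, the observed event frequency against the forecast probability; points on the diagonal indicate a well-calibrated ensemble.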