Title: A dynamically consistent reconstruction of surface temperature changes during the last 600 years based on climate model simulations using data assimilation
Authors: H. Goosse, E. Crespin, M. E. Mann, H. Renssen, A. Timmermann
Conference: EGU General Assembly 2009
Media type: Article
Language: English
Digital document: PDF
Published in: GRA - Volume 11 (2009)
Record number: 250020465
 
Abstract
Many estimates of surface temperature changes in the past millennium are now available, both from proxy-based reconstructions using statistical methods and from simulations with global climate models. Both approaches have their own strengths and limitations. On the one hand, the results of climate models depend strongly on the forcings applied as well as on the model physics (in particular the climate sensitivity and the efficiency of heat uptake). Furthermore, models cannot reproduce the time evolution of the internal variability of the system because of the chaotic nature of this type of variability: the best the models can do is reproduce the statistics of that variability, provided they include the right physics. On the other hand, reconstructions based on proxy records depend on the ability of the proxies to track climate changes effectively and on the assumptions made in the method. Furthermore, there is no guarantee that the patterns provided by the method are consistent with the dynamics of the real climate system. Both model simulations and statistical reconstructions have provided very interesting results over the last few years. Here, the goal is to show how additional, complementary information can be obtained by combining model results and proxy data through data assimilation in the climate model LOVECLIM. In this framework, the model is forced to follow temperature histories obtained from a recent compilation of well-calibrated surface temperature proxies. This is achieved using a simple data assimilation technique that can be briefly described as follows. For each year, a large ensemble of simulations is performed (96 here). The member of the ensemble that is closest to the observations is then selected as representative of that year and used as the initial condition for the subsequent year.
The distance between the model results and the proxy record is measured by a cost function based on reconstructed and simulated temperatures at the locations where proxies are available; the simulation retained is the one that minimizes this cost function. The technique thus provides a continuous record over the past millennium that is compatible with the model physics, with the forcing applied, and with the available proxy records. In a first step, we will show that the technique effectively allows a good representation of the signal recorded by the proxies in the Northern Hemisphere over the last 600 years. Secondly, by using different forcings and different model parameters (leading to different climate sensitivities), we will show that, thanks to the data assimilation, all the simulations provide very similar results. This implies that the results are much less dependent on the selected model parameters and forcing than simulations without data assimilation, and thus more robust. These model results with data assimilation are very similar to a reconstruction of Northern Hemisphere temperature based on the same proxy data using a classical statistical method, increasing confidence in that reconstruction. The ability to reconstruct regional temperature changes from our simulations with data assimilation will then be demonstrated for land areas of the Northern Hemisphere where enough data are available. By contrast, our reconstruction is not yet useful over the Atlantic and Pacific Oceans, likely because of the lack of data in those regions.
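The ensemble-selection scheme described above (run 96 members for one year, score each against the proxies with a cost function, and carry the best member forward as the next year's initial condition) can be sketched as follows. This is a minimal toy illustration, not the authors' LOVECLIM setup: `run_model_one_year`, the perturbation scale, the site count, and the mean-squared-difference cost are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
N_MEMBERS = 96   # ensemble size stated in the abstract
N_SITES = 12     # number of proxy locations (illustrative only)

def run_model_one_year(state, perturbation):
    """Stand-in for one model year: a real run would integrate LOVECLIM.
    Returns the new state and the simulated temperatures at the proxy sites."""
    new_state = state + perturbation
    return new_state, new_state[:N_SITES]

def cost(simulated, proxy):
    """Cost function: distance between simulated and reconstructed
    temperatures at the proxy locations (here, mean squared difference)."""
    return np.mean((simulated - proxy) ** 2)

def assimilate(initial_state, proxy_series):
    """For each year, run an ensemble of perturbed simulations, keep the
    member that minimizes the cost function, and use it as the initial
    condition for the subsequent year."""
    state = initial_state
    trajectory = []
    for proxy in proxy_series:  # one entry of proxy temperatures per year
        candidates = []
        for _ in range(N_MEMBERS):
            perturbation = rng.normal(0.0, 0.1, size=state.shape)
            new_state, sim_temps = run_model_one_year(state, perturbation)
            candidates.append((cost(sim_temps, proxy), new_state))
        # Select the ensemble member closest to the proxy record.
        best_cost, state = min(candidates, key=lambda c: c[0])
        trajectory.append((best_cost, state.copy()))
    return trajectory

# Toy usage: five "years" of synthetic proxy temperatures.
proxies = rng.normal(0.0, 0.5, size=(5, N_SITES))
traj = assimilate(np.zeros(N_SITES + 4), proxies)
print(len(traj))  # one selected state per year
```

Chaining the selected member into the next year is what makes the resulting record continuous and dynamically consistent with the model physics, rather than an independent best fit for each year.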