Title Three decades of harnessing the GPS data explosion for geophysics (Vening Meinesz Medal Lecture)
Author Geoffrey Blewitt
Conference EGU General Assembly 2015
Media type Article
Language English
Digital document PDF
Published In: GRA - Volume 17 (2015)
Record number 250114575
Publication (no.) Full-text document available: EGU/EGU2015-15364.pdf
 
Abstract
In this presentation, I attempt to convey the immensity of the task that faced the geodesy community three decades ago, and continues to challenge us: to harness all potentially valuable GPS data available in the world for geophysical science. It would be fair to say that three decades ago we were struggling with controlled tests just to get GPS geodesy working, and had little time to imagine the flood of data today. Yet the geodesy community has succeeded in meeting this challenge. Today, for example, the Nevada Geodetic Laboratory produces and makes publicly available coordinate time series for over 12,000 geodetic GPS stations around the globe, with various data intervals, latencies, and reference frames. About 8,000 stations have their daily time series updated every week, with 4,000 being updated the next day with coordinates at daily and 5-minute intervals. About 2,000 stations have their time series updated every hour with coordinates at 5-minute intervals. I will show examples of how these time series are being used by NGL and many other scientists to study a wide variety of geophysical topics, including plate tectonics, earthquake modeling, seismic and tsunami hazard, volcanic deformation, water resources, mountain growth, terrestrial reference frame realization, glacial isostatic adjustment, ice sheet melting, sea level rise and coastal subsidence, and even fundamental physics, using GPS atomic clocks to probe the nature of dark matter in the universe. The explosion in GPS data has challenged us to invent new data processing algorithms and develop robust automation in order to keep up with the flood. This explosion has been exponential, so it is not a recent phenomenon; rather, it began in the earliest years of GPS geodesy and has always posed a challenge to us.
Over the course of my post-doctoral career starting in late 1985, I have had the good fortune to witness the key developments that have taken place since the early years of geodetic GPS and over the course of three decades. These developments continue today as strongly as ever. Essential innovations have included, for example, automation of GPS cycle slip detection and mitigation, carrier phase ambiguity resolution, the birth and operation of the IGS for reliable orbit and clock estimation, the invention of algorithms that scale linearly with the number of stations, and the deep integration of GPS solutions into the ITRF, providing measures of accuracy, precision, and stability. As a recent example of automation, I show a new non-parametric algorithm to estimate station velocities quickly and robustly, without the need to detect and correct for outliers, seasonal signals, and discontinuities (steps) in the time series that commonly occur due to equipment changes. The complete automation from data collection to production of station velocities (and, now, velocity time series) allows us to process all potentially valuable data, and to focus more on discovery and analysis of the results for geophysical applications, often with great redundancy in the data leading to high statistical significance and more robust scientific conclusions. I show by example that another benefit of this capability to process all data in a robust turn-key fashion is to enhance the opportunity for making discoveries, without necessarily planning all of the steps that can lead us to discovery's door.
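To illustrate the idea of a non-parametric velocity estimator that needs no explicit modeling of outliers, seasonal signals, or steps, here is a minimal sketch in the Theil–Sen spirit: take the median of slopes over data pairs spaced about one year apart, so that annual signals largely cancel and isolated outliers cannot move the median. The function name, the one-year pairing scheme, and the tolerance are illustrative assumptions, not the author's exact algorithm.

```python
import numpy as np

def robust_velocity(t, y, pair_gap=1.0, tol=0.002):
    """Median-of-slopes velocity estimate from a coordinate time series.

    t : epochs in decimal years; y : coordinate (e.g. metres).
    Slopes are formed only from pairs separated by roughly pair_gap years;
    with a one-year gap, annual signals largely cancel, and the median
    makes the estimate insensitive to outliers and to a small number of
    step discontinuities.  (Illustrative sketch, not the NGL algorithm.)
    """
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    slopes = []
    for i in range(t.size):
        dt = t - t[i]
        # partners roughly pair_gap years later in time
        for j in np.where(np.abs(dt - pair_gap) < tol)[0]:
            slopes.append((y[j] - y[i]) / dt[j])
    if not slopes:
        raise ValueError("no data pairs found at the requested spacing")
    return np.median(slopes)
```

For example, on a synthetic four-year daily series with a linear trend, an annual sine, Gaussian noise, and a few large outliers, the median of one-year slopes recovers the trend without any outlier detection or seasonal fitting.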