To assess ice sheet mass changes from observation data, such as from GRACE or altimetry, it
is customary to adjust simple mathematical models (e.g., linear or quadratic in time) to the
time series and to consider the adjusted model parameters as the trends. The underlying
concept may be that the geophysical process of interest (long-term trend) can be
described by such a mathematical model. In this conceptual framework, deviations of
the data from the mathematical model are noise, attributable either to data errors
or to geophysical processes that are distinct from the long-term process of
interest.
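For concreteness, the customary approach can be sketched as follows. This is a minimal toy example, not real GRACE data: all numbers are invented, and the annual cycle stands in for the "noise" of this conceptual framework.

```python
import numpy as np

# Toy monthly mass series over 10 years (all numbers invented):
# a -80 Gt/yr linear term, a -5 Gt/yr^2 quadratic term, and an annual
# cycle playing the role of "noise" in the conceptual framework.
t = np.arange(0.0, 10.0, 1.0 / 12.0)                  # time in years
series = -80.0 * t - 5.0 * t**2 + 40.0 * np.cos(2.0 * np.pi * t)

# Least-squares adjustment of a quadratic model; the adjusted
# coefficients are then read as the "trend" parameters.
c2, c1, c0 = np.polyfit(t, series, 2)
residuals = series - np.polyval([c2, c1, c0], t)      # treated as noise

print(f"adjusted linear coefficient:    {c1:7.1f} Gt/yr")
print(f"adjusted quadratic coefficient: {c2:7.1f} Gt/yr^2")
```

The adjustment nearly recovers the prescribed polynomial coefficients here because the "noise" is a zero-mean oscillation over an integer number of cycles; the point of the abstract is precisely that real deviations are not so benign.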
Now that both observations and the understanding of ice sheet processes have expanded
and improved, it appears that (1) the relative importance of data noise has decreased
compared to true geophysical deviations from a mathematically simple (linear,
quadratic, etc.) temporal behavior; and (2) variations and transitions in the regimes of
both surface mass balance and ice flow occur on a wide range of time scales and are
interlinked, so that a distinction between long-term and short-term processes is not
evident.
As a consequence, trends adjusted over different periods are inherently different, and the
simple (linear, quadratic, etc.) mathematical models merely approximate a local temporal
behavior. This has consequences for the interpretation and intercomparison of
trends. For example, consider that for a 5-year interval a linear trend of -100 Gt/yr is
estimated from GRACE. Then, neither does this trend imply that the best mass change
estimate over those 5 years is -500 Gt, nor does it necessarily represent a longer-term mass
evolution. Additional complications arise when analyses are based on filtered time
series.
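This caveat can be made concrete with a synthetic series (all numbers invented): a -100 Gt/yr secular trend superposed with a 6-year oscillation standing in for interannual variability. Linear trends adjusted over two different 5-year windows then disagree strongly, and neither trend multiplied by 5 years equals the actual mass change over its window.

```python
import numpy as np

# Synthetic mass series (invented, purely illustrative): a -100 Gt/yr
# secular trend plus a 6-year oscillation of 300 Gt amplitude.
def mass(t):
    return -100.0 * t + 300.0 * np.sin(2.0 * np.pi * t / 6.0)

def linear_trend(t0, t1, dt=1.0 / 12.0):
    """Least-squares linear trend (Gt/yr) over [t0, t1], monthly sampling."""
    t = np.arange(t0, t1 + dt / 2.0, dt)
    slope, _ = np.polyfit(t, mass(t), 1)
    return slope

trend_a = linear_trend(0.0, 5.0)   # 5-year window starting at year 0
trend_b = linear_trend(3.0, 8.0)   # 5-year window starting at year 3

# The two 5-year trends differ by roughly 250 Gt/yr, and trend * 5 yr
# does not match the actual mass change over either window.
print(f"window A: {trend_a:7.1f} Gt/yr, actual change {mass(5.0) - mass(0.0):8.1f} Gt")
print(f"window B: {trend_b:7.1f} Gt/yr, actual change {mass(8.0) - mass(3.0):8.1f} Gt")
```

Depending on how the oscillation phases align with the window, the adjusted trend can be far steeper or far shallower than the -100 Gt/yr secular rate, illustrating why trends from different periods are not directly comparable.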
In this presentation, the problem is illustrated with simple, partly synthetic
examples. Alternative approaches are discussed for arriving at mathematical representations
of ice sheet changes that are adapted both to the nature of those changes and to the data
availability and quality.