Title: Entropy train diagrams for information-based measures of statistical association
Author(s): Jobst Heitzig, Jakob Runge
Conference: EGU General Assembly 2011
Media type: Article
Language: English
Digital document: PDF
Published in: GRA - Volume 13 (2011)
Record number: 250053149
 
Abstract
Nonlinear statistical associations between geophysical processes are increasingly often analyzed with concepts from information theory, i.e., by measuring certain amounts of entropy, often taking possible time-lags into account. We offer an intuitive visual tool that allows for easy understanding and comparison of many information-theoretic statistics, and use it to discuss existing measures and to introduce two new measures of sensitivity and coupling strength.

Given two (possibly multivariate) processes X, Y, the most prominent information-theoretic association measures are the symmetric measure of mutual information I(X; Y) and the directed measure of transfer entropy T(X → Y). Mutual information can be interpreted as that “part” of the entropy H(X_t, Y_t) of the present joint state (X_t, Y_t) that “is part of” both the entropy of X_t and that of Y_t. Transfer entropy can be interpreted as that amount of the entropy of Y_t that “first visited” the joint system (X, Y) at subsystem X at some past time t - k and did not “visit” subsystem Y before the present time t (but might have visited X again between t - k and t). In the special case of discrete-time Gaussian processes, such measures can typically be expressed in terms of certain (co-)variances between unlagged or lagged variables or residuals. Transfer entropy is then proportional to the linear measure of Granger “causality” and is thus influenced by possible auto-correlations in X but not by those in Y.

We introduce entropy train diagrams as an intuitive way to illustrate the above kind of entropy flow interpretation. Such a diagram depicts a possible path that a part of the entropy (or a “signal”) can take through the joint system (X, Y) and its subsystems X and Y over time. It combines a time axis with a Venn diagram that corresponds to the decomposition of the entropy H(X_t, Y_t) into the sum of mutual information and two conditional entropies, H(X_t, Y_t) = H(X_t | Y_t) + I(X_t; Y_t) + H(Y_t | X_t). A simple colour-coding marks mandatory, optional, and excluded places the signal “visits” at different time-points. This allows for an easy understanding and comparison of the often lengthy and formalistic definitions of information-based association measures.

We first use entropy train diagrams to illustrate different decompositions of the transfer entropy of discrete-time processes into lag-specific parts, which can be used to determine the probable time lags at which two systems might be coupled, or to filter for relevant lags that are predicted by some theory. Finally, we use entropy train diagrams to motivate a modification of transfer entropy in two opposing directions, resulting in new aggregate measures of non-linear association and their symmetrized and linear counterparts: Backdoor entropy reflects the aggregate effect on the present value of Y of all signals that visited X before ever visiting Y; this effect may be mediated by auto-correlation in X or Y, so backdoor entropy can be interpreted as a measure of “sensitivity”. Leap entropy, in contrast, reflects that part of the aggregate effect of these signals which was not mediated by auto-correlation in X or Y, and is thus a measure of “coupling strength”.
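
To make the Gaussian special case concrete: the following minimal sketch (Python with NumPy; the function name, the lag order k, and the toy lag-1 coupling are illustrative assumptions, not part of the abstract) estimates linear transfer entropy T(X → Y) as half the log ratio of residual variances of two nested regressions of Y_t, the quantity proportional to linear Granger causality.

    import numpy as np

    def linear_transfer_entropy(x, y, k=1):
        # Linear (Gaussian) transfer entropy T(X -> Y) in nats:
        # 0.5 * ln( var of residual of Y_t on Y's own past
        #           / var of residual of Y_t on Y's past AND X's past ).
        n = len(y)
        target = y[k:]
        y_past = np.column_stack([y[k - j - 1 : n - j - 1] for j in range(k)])
        x_past = np.column_stack([x[k - j - 1 : n - j - 1] for j in range(k)])
        ones = np.ones((n - k, 1))
        def resid_var(design):
            beta, *_ = np.linalg.lstsq(design, target, rcond=None)
            return np.var(target - design @ beta)
        v_restricted = resid_var(np.hstack([ones, y_past]))
        v_full = resid_var(np.hstack([ones, y_past, x_past]))
        return 0.5 * np.log(v_restricted / v_full)

    # Toy system: X drives Y at lag 1, so T(X -> Y) > 0 while T(Y -> X) ~ 0.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000)
    y = np.zeros_like(x)
    for t in range(1, len(x)):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
    print(linear_transfer_entropy(x, y))  # clearly positive
    print(linear_transfer_entropy(y, x))  # near zero

In this toy setup X is white noise; with an auto-correlated X, the estimate would change, in line with the abstract's remark that transfer entropy is influenced by auto-correlations in X but not by those in Y.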
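
Likewise, the Venn-diagram decomposition H(X_t, Y_t) = H(X_t | Y_t) + I(X_t; Y_t) + H(Y_t | X_t) that the diagrams are built on can be checked numerically with closed-form Gaussian entropies; a minimal sketch (the correlation value 0.6 is an arbitrary illustrative choice):

    import numpy as np

    rho = 0.6  # correlation between X_t and Y_t (illustrative value)
    cov = np.array([[1.0, rho], [rho, 1.0]])

    def gauss_entropy(c):
        # Differential entropy (nats) of a Gaussian with covariance matrix c.
        c = np.atleast_2d(c)
        d = c.shape[0]
        return 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(c))

    H_xy = gauss_entropy(cov)         # H(X_t, Y_t)
    H_x = gauss_entropy(cov[:1, :1])  # H(X_t)
    H_y = gauss_entropy(cov[1:, 1:])  # H(Y_t)
    I_xy = H_x + H_y - H_xy           # I(X_t; Y_t) = -0.5 * ln(1 - rho**2)
    H_x_given_y = H_xy - H_y          # H(X_t | Y_t)
    H_y_given_x = H_xy - H_x          # H(Y_t | X_t)

    # The three disjoint Venn-diagram pieces sum to the joint entropy:
    print(np.isclose(H_x_given_y + I_xy + H_y_given_x, H_xy))  # True
    print(I_xy, -0.5 * np.log(1 - rho**2))                     # both ~0.223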