
Mutual information

The time-delayed mutual information was suggested by Fraser and Swinney [25] as a tool to determine a reasonable delay: unlike the autocorrelation function, the mutual information also takes nonlinear correlations into account. One has to compute
\[ S(\tau) \;=\; \sum_{ij} p_{ij}(\tau) \,\ln\frac{p_{ij}(\tau)}{p_i\,p_j}\;, \]
where, for some partition of the real axis, $p_i$ is the probability to find a time series value in the $i$-th interval, and $p_{ij}(\tau)$ is the joint probability that an observation falls into the $i$-th interval and the observation a time $\tau$ later falls into the $j$-th. In theory, this expression has no systematic dependence on the size of the partition elements and can be computed quite easily. There exist good arguments that if the time-delayed mutual information exhibits a marked minimum at a certain value of $\tau$, then this is a good candidate for a reasonable time delay. However, these arguments have to be modified when the embedding dimension exceeds two. Moreover, as will become transparent in the following sections, not all applications work optimally with the same delay.

Our routine mutual uses the equation above; the number of boxes of identical size and the maximal delay time have to be supplied. The adaptive algorithm used in [25] is more data intensive. Since we are not really interested in absolute values of the mutual information here, but rather in the location of the first minimum, the minimal implementation given here seems sufficient. The related generalized mutual information of order two can be defined using the correlation sum concept (see the section on correlation sums; [26, 27]). Estimation of the correlation entropy is explained in a later section.
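As an illustration, here is a minimal sketch in C of the box-counting estimate above: it partitions the data range into boxes of identical size, accumulates the joint histogram of pairs $(x_t, x_{t+\tau})$, and sums $p_{ij}\ln[p_{ij}/(p_i p_j)]$ over the occupied boxes. The function name mutual_information, its interface, and the toy data in main are illustrative assumptions, not the actual code of the TISEAN routine mutual.

    /* Sketch of the time-delayed mutual information, estimated with a
       fixed partition into nbins boxes of identical size.
       Names and interface are illustrative, not TISEAN's. */
    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    double mutual_information(const double *x, long n, long tau, int nbins)
    {
        long npairs = n - tau;
        long *joint = calloc((size_t)nbins * nbins, sizeof(long));
        long *pi = calloc(nbins, sizeof(long));   /* marginal of x(t)     */
        long *pj = calloc(nbins, sizeof(long));   /* marginal of x(t+tau) */
        double xmin = x[0], xmax = x[0];

        for (long t = 1; t < n; t++) {        /* data range -> partition */
            if (x[t] < xmin) xmin = x[t];
            if (x[t] > xmax) xmax = x[t];
        }
        double scale = (nbins - 1e-9) / (xmax - xmin);

        for (long t = 0; t < npairs; t++) {   /* fill the histograms     */
            int a = (int)((x[t]       - xmin) * scale);
            int b = (int)((x[t + tau] - xmin) * scale);
            joint[a * nbins + b]++;
            pi[a]++;
            pj[b]++;
        }

        double s = 0.0;           /* S = sum p_ij ln(p_ij / (p_i p_j))   */
        for (int a = 0; a < nbins; a++)
            for (int b = 0; b < nbins; b++) {
                long c = joint[a * nbins + b];
                if (c > 0)
                    s += (double)c / npairs
                         * log((double)c * npairs / ((double)pi[a] * pj[b]));
            }

        free(joint); free(pi); free(pj);
        return s;
    }

    /* Scan delays and report the first local minimum of S(tau), which is
       the candidate delay discussed in the text. */
    int main(void)
    {
        enum { N = 10000, NBINS = 16, MAXTAU = 100 };
        static double x[N];
        for (long t = 0; t < N; t++)          /* toy data: a noisy sine  */
            x[t] = sin(0.05 * t) + 0.1 * ((double)rand() / RAND_MAX - 0.5);

        double prev = mutual_information(x, N, 1, NBINS);
        for (long tau = 2; tau <= MAXTAU; tau++) {
            double s = mutual_information(x, N, tau, NBINS);
            if (s > prev) {                   /* S turned upward         */
                printf("first minimum of S at tau = %ld\n", tau - 1);
                return 0;
            }
            prev = s;
        }
        printf("no minimum up to tau = %d\n", MAXTAU);
        return 0;
    }

Note that, as in the text, the absolute value of S depends weakly on the number of boxes, but the location of the first minimum is what matters, so a coarse fixed partition suffices for this purpose.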


