AN INFORMATIC APPROACH TO A LONG MEMORY STATIONARY PROCESS*

  • Yiming DING,
  • Liang WU,
  • Xuyan XIANG
  • 1. College of Science, Wuhan University of Science and Technology, Wuhan 430081, China;
    2. Hubei Province Key Laboratory of System Science in Metallurgical Process, Wuhan University of Science and Technology, Wuhan 430065, China;
    3. Center of Statistical Research, School of Statistics, Southwestern University of Finance and Economics, Chengdu 611130, China;
    4. Hunan Province Cooperative Innovation Center for TCDDLEEZ, School of Mathematics and Physics Science, Hunan University of Arts and Science, Changde 415000, China
Yiming DING, E-mail: dingym@wust.edu.cn; Liang WU, E-mail: wuliang@swufe.edu.cn

Received date: 2022-04-26

  Revised date: 2023-05-25

  Online published: 2023-12-08

Supported by

This work was supported by the Scientific Research Foundation for the Returned Overseas Chinese Scholars of State Education Ministry, the Key Scientific Research Project of Hunan Provincial Education Department (19A342), the National Natural Science Foundation of China (11671132, 61903309 and 12271418), the National Key Research and Development Program of China (2020YFA0714200), Sichuan Science and Technology Program (2023NSFSC1355), and the Applied Economics of Hunan Province. All authors are co-first authors of the article.

Abstract

Long memory is an important phenomenon that sometimes arises in the analysis of time series or spatial data. Most definitions of long memory for a stationary process are based on the second-order properties of the process. The mutual information between the past and the future, $I_{p-f}$, of a stationary process quantifies the information stored in the history of the process that can be used to predict its future. We propose calling a stationary process long memory if its $I_{p-f}$ is infinite. For a stationary process with finite block entropy, $I_{p-f}$ equals the excess entropy, which is the sum of the redundancies relating the rate of convergence of the conditional (differential) entropy to the entropy rate. Since the definitions of $I_{p-f}$ and of excess entropy require only a very weak moment condition on the distribution of the process, they can be applied to processes whose distributions do not have a finite second moment. A significant property of $I_{p-f}$ is that it is invariant under one-to-one transformations; this allows the $I_{p-f}$ of a stationary process to be obtained from that of processes related to it by such transformations. For a stationary Gaussian process, long memory in the sense of mutual information is stricter than long memory in the sense of covariance. We demonstrate that the $I_{p-f}$ of fractional Gaussian noise is infinite if and only if the Hurst parameter $H \in (1/2, 1)$.
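To make the covariance-based dichotomy concrete, the following numerical sketch (an illustration, not code from the paper) evaluates the standard autocovariance of unit-variance fractional Gaussian noise, $\gamma(k)=\frac12(|k+1|^{2H}-2|k|^{2H}+|k-1|^{2H})$, which decays like $H(2H-1)k^{2H-2}$ as $k\to\infty$. The partial sums of $|\gamma(k)|$ keep growing when $H>1/2$ but level off when $H<1/2$, consistent with the abstract's claim that the $I_{p-f}$ of fractional Gaussian noise is infinite exactly for $H\in(1/2,1)$. Function names here are illustrative.

```python
import numpy as np

def fgn_autocov(lags, H):
    """Autocovariance of unit-variance fractional Gaussian noise:
    gamma(k) = 0.5 * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})."""
    k = np.abs(np.asarray(lags, dtype=float))
    return 0.5 * (np.abs(k + 1) ** (2 * H)
                  - 2.0 * k ** (2 * H)
                  + np.abs(k - 1) ** (2 * H))

def abs_cov_partial_sum(H, n):
    """Partial sum sum_{k=0}^{n} |gamma(k)|.  For H > 1/2 this grows like
    n^{2H-1} (long memory in the covariance sense); for H < 1/2 it converges."""
    return float(np.sum(np.abs(fgn_autocov(np.arange(n + 1), H))))

if __name__ == "__main__":
    for H in (0.3, 0.7):
        s_small = abs_cov_partial_sum(H, 1_000)
        s_large = abs_cov_partial_sum(H, 100_000)
        print(f"H={H}: partial sum up to 10^3 = {s_small:.4f}, "
              f"up to 10^5 = {s_large:.4f}")
```

Running the script shows the H=0.7 partial sums still increasing by a large factor between n = 10^3 and n = 10^5, while the H=0.3 sums are essentially unchanged.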

Cite this article

Yiming DING, Liang WU, Xuyan XIANG. AN INFORMATIC APPROACH TO A LONG MEMORY STATIONARY PROCESS*[J]. Acta Mathematica Scientia, Series B, 2023, 43(6): 2629-2648. DOI: 10.1007/s10473-023-0619-0
