State space reconstruction

Disclaimer: I know next to nothing about this.

But I think it’s something like this: look at data from a possibly stochastic dynamical system and hope to infer cool things about the kinds of hidden states it has, in some general sense, such as some measure of statistical or computational complexity, or how complicated or “large” the underlying state space is in some convenient representation.

TBH I don’t understand this framing, but possibly that is because I don’t come from a dynamical systems group; I just dabble in special cases thereof. Surely you either do physics, and work out the dynamics of your system from experiment, or you do statistics, and select an appropriate model to minimise some estimated predictive loss, trading off data set size, model complexity and algorithmic complexity. Clearly I need to read more to understand the rationale here.

Anyway, tools here seem to include inventing large spaces of hidden states (Takens embedding); does this get us some nice algebraic properties? Also, how does delay embedding relate? Is that the same? (Further bonus question – did SiJR04 rediscover delay embedding, or have they extended it?) Sample complexity results here seem to be scanty, possibly because they usually want their chaos to be deterministic and admitting noise would be fiddly.
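
To make the delay-embedding idea concrete, here is a minimal sketch (Python; the helper name delay_embed and the toy parameters are my own inventions, not taken from any of the references below): stack lagged copies of a scalar observable so that each row is a point in a reconstructed state space. Takens’ theorem says that for deterministic dynamics, a generic smooth observable, and a large enough embedding dimension, this reconstruction is diffeomorphic to the original attractor.

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Delay-embedding matrix of a scalar series x.

    Row t is (x[t], x[t + tau], ..., x[t + (dim - 1) * tau]),
    a point in the reconstructed state space.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim and tau")
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Toy example: a noisy sine observed as a scalar series.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
X = delay_embed(x, dim=3, tau=25)   # shape (1950, 3)
```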

OTOH, from a statistical perspective there are lots of useful techniques to infer the state spaces of special classes of dynamical systems, even with nonlinear dynamics: e.g. plain old model-based count time series such as branching processes, grammatical inference of formal syntax, and nonlinear system identification.
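
For instance, in the branching-process case, inferring the hidden “dynamics” can be as simple as estimating an offspring mean from an observed count series. A minimal sketch (Python; the Poisson offspring assumption and the parameter values are arbitrary choices of mine): the classical Harris estimator is total offspring divided by total parents.

```python
import numpy as np

def offspring_mean_mle(z):
    """Harris estimator of the offspring mean of a Galton-Watson
    branching process: total offspring divided by total parents."""
    z = np.asarray(z, dtype=float)
    parents = z[:-1].sum()
    if parents == 0:
        raise ValueError("no parents observed; offspring mean not identified")
    return z[1:].sum() / parents

# Toy data: simulate a Poisson-offspring branching process, then recover its mean.
rng = np.random.default_rng(0)
true_mean, z = 1.1, [10]
for _ in range(200):
    z.append(rng.poisson(true_mean * z[-1]))
    if z[-1] == 0:
        break
print(offspring_mean_mle(z))   # should land near 1.1
```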

I would be interested to see a compelling new insight from the dynamical system perspective on these problems. New estimators; models outside the ken of Kalman filters?

Stuff that I might actually use

Hirata’s reconstruction (Hirata, Horai, and Aihara 2008) looks like good clean decorative fun: you can represent graphs (recurrence plots) by an equivalent dynamical system.
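
As a toy version of the forward direction, here is a sketch (Python; recurrence_matrix and the eps threshold are hypothetical choices of mine) that builds the binary recurrence plot, i.e. the graph, from a trajectory; Hirata, Horai, and Aihara (2008) go the other way, recovering a distance matrix and then a time series from that graph.

```python
import numpy as np

def recurrence_matrix(X, eps):
    """Binary recurrence plot: R[i, j] = 1 when states i and j are
    within eps of each other. This adjacency matrix is the graph that
    a Hirata-style reconstruction starts from."""
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:
        X = X[:, None]              # treat a scalar series as 1-d states
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return (d <= eps).astype(int)

# Usage: recurrence plot of the delay-embedded series X from the earlier sketch.
# R = recurrence_matrix(X, eps=0.2)
```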

Refs

Badii, Remo, and Antonio Politi. 1999. Complexity: Hierarchical Structures and Scaling in Physics. Cambridge Nonlinear Science Series. Cambridge University Press.

Charles, Adam, Aurele Balavoine, and Christopher Rozell. 2016. “Dynamic Filtering of Time-Varying Sparse Signals via L1 Minimization.” IEEE Transactions on Signal Processing 64 (21): 5644–56. https://doi.org/10.1109/TSP.2016.2586745.

Crutchfield, James P., and Karl Young. 1989. “Inferring Statistical Complexity.” Physical Review Letters 63 (2): 105–8. https://doi.org/10.1103/PhysRevLett.63.105.

Dupont, P., François Denis, and Y. Esposito. 2005. “Links Between Probabilistic Automata and Hidden Markov Models: Probability Distributions, Learning Models and Induction Algorithms.” Pattern Recognition 38: 1349–71. https://doi.org/10.1016/j.patcog.2004.03.020.

Foote, Jonathan. 1999. “Visualizing Music and Audio Using Self-Similarity.” In Proceedings of the Seventh ACM International Conference on Multimedia (Part 1), 77–80. MULTIMEDIA ’99. New York, NY, USA: ACM. https://doi.org/10.1145/319463.319472.

Grassberger, Peter, Thomas Schreiber, and C. Schaffrath. 1991. “Nonlinear Time Sequence Analysis.” International Journal of Bifurcation and Chaos 1 (3): 521–47. https://doi.org/10.1142/S0218127491000403.

Hamilton, Franz, Tyrus Berry, and Timothy Sauer. 2016. “Kalman-Takens Filtering in the Presence of Dynamical Noise,” November. http://arxiv.org/abs/1611.05414.

Hirata, Yoshito, Shunsuke Horai, and Kazuyuki Aihara. 2008. “Reproduction of Distance Matrices and Original Time Series from Recurrence Plots and Their Applications.” The European Physical Journal Special Topics 164 (1): 13–22. https://doi.org/10.1140/epjst/e2008-00830-8.

Hirata, Yoshito, Hideyuki Suzuki, and Kazuyuki Aihara. 2006. “Reconstructing State Spaces from Multivariate Data Using Variable Delays.” Physical Review E 74 (2): 026202. https://doi.org/10.1103/PhysRevE.74.026202.

Kantz, Holger, and Thomas Schreiber. 2004. Nonlinear Time Series Analysis. 2nd ed. Cambridge, UK; New York: Cambridge University Press.

Levin, David N. 2017. “The Inner Structure of Time-Dependent Signals,” March. http://arxiv.org/abs/1703.08596.

Marwan, Norbert. 2008. “A Historical Review of Recurrence Plots.” The European Physical Journal Special Topics 164 (1): 3–12. https://doi.org/10.1140/epjst/e2008-00829-1.

Shalizi, Cosma Rohilla, and James P. Crutchfield. 2000. “Computational Mechanics: Pattern and Prediction, Structure and Simplicity.”

Shalizi, Cosma Rohilla, and James P. Crutchfield. 2002. “Information Bottlenecks, Causal States, and Statistical Relevance Bases: How to Represent Relevant Information in Memoryless Transduction.” Advances in Complex Systems 05 (01): 91–95. https://doi.org/10.1142/S0219525902000481.

Shalizi, Cosma Rohilla, and Kristina Lisa Shalizi. 2004. “Blind Construction of Optimal Nonlinear Recursive Predictors for Discrete Sequences.” In Proceedings of the Twentieth Conference on Uncertainty in Artificial Intelligence (UAI 2004), 504–11.

Sugihara, George, Robert May, Hao Ye, Chih-hao Hsieh, Ethan Deyle, Michael Fogarty, and Stephan Munch. 2012. “Detecting Causality in Complex Ecosystems.” Science 338 (6106): 496–500. https://doi.org/10.1126/science.1227079.