Observability and sensitivity in learning dynamical systems

Parameter identifiability in dynamical models

November 9, 2020

Tags: dynamical systems, Markov processes, regression, signal processing, statistics, statmech, time series

How precisely can I learn a given parameter of a dynamical system from observation? In ODE theory a useful concept is sensitivity analysis, which tells us how much gradient information our observations give us about a parameter. This comes in local (at my current estimate) and global (over the whole parameter range) flavours.
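As a concrete toy sketch (my own example, not drawn from any particular source): local sensitivity of a logistic-growth ODE to its growth-rate parameter, approximated by central finite differences around a nominal estimate.

```python
# A minimal sketch of local sensitivity analysis on a hypothetical model:
# how much does the observed trajectory of a logistic-growth ODE change
# per unit change in the growth-rate parameter r, at the current estimate?
import numpy as np
from scipy.integrate import solve_ivp

def logistic(t, x, r, K):
    return r * x * (1.0 - x / K)

def trajectory(r, K=10.0, x0=0.1, t_obs=np.linspace(0.0, 10.0, 50)):
    sol = solve_ivp(logistic, (t_obs[0], t_obs[-1]), [x0],
                    t_eval=t_obs, args=(r, K))
    return sol.y[0]

r_hat = 0.8   # nominal (current) parameter estimate
eps = 1e-5    # finite-difference step
# Local sensitivity dx(t)/dr along the observation grid, central differences.
sens = (trajectory(r_hat + eps) - trajectory(r_hat - eps)) / (2 * eps)
print(sens.argmax())  # index of the observation most informative about r
```

A global analysis would instead average some such measure over a prior range of r, e.g. by Sobol' indices, rather than evaluating at a single estimate.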

In linear systems theory the term observability is used to discuss whether we can in fact identify a parameter or a latent state; I will conflate the two for present purposes.
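For the linear time-invariant case there is a classical rank test. A minimal sketch, assuming the textbook criterion for x' = Ax, y = Cx: the state is observable exactly when the stacked matrix [C; CA; …; CA^(n-1)] has full column rank.

```python
# Classical observability test for a linear state-space model
# x' = A x, y = C x: check the rank of the observability matrix.
import numpy as np

def observability_matrix(A, C):
    n = A.shape[0]
    blocks = [C @ np.linalg.matrix_power(A, k) for k in range(n)]
    return np.vstack(blocks)

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])   # we observe only the first coordinate

O = observability_matrix(A, C)
print(np.linalg.matrix_rank(O) == A.shape[0])  # True: the latent state is recoverable
```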

Sometimes learning a parameter as such is a red herring; what we in fact wish to learn is an object that is a function of the parameters, such as a transfer function, and many different parameter combinations will approximate that object similarly well. If we know what the actual object of interest is, we might hope to integrate out the nuisance parameters and measure sensitivity to the object itself; but maybe we do not even know that. Then what do we do?
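To see why the parameters themselves can be a red herring while the transfer function is not, here is a toy sketch of my own: two state-space parameterizations related by a similarity transform produce identical transfer functions H(s) = C(sI − A)⁻¹B, so no amount of input-output data can separate them.

```python
# Two different parameterizations, one input-output behaviour: the
# parameters (A, B, C) are not identifiable, but H(s) is.
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

T = np.array([[1.0, 2.0], [0.0, 1.0]])  # any invertible change of state coordinates
A2, B2, C2 = T @ A @ np.linalg.inv(T), T @ B, C @ np.linalg.inv(T)

def H(A, B, C, s):
    return (C @ np.linalg.inv(s * np.eye(A.shape[0]) - A) @ B).item()

s = 1.0 + 2.0j
print(np.isclose(H(A, B, C, s), H(A2, B2, C2, s)))  # True: same transfer function
```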

1 Ergodicity

The contact between ergodic theorems and statistical identifiability: roughly, ergodicity is what lets time averages along a single realization stand in for the expectations that identification requires.
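A rough sketch using a toy AR(1) process of my own: a single long realization identifies the autoregressive coefficient, because the lag-1 time-average moment estimator converges to its population counterpart.

```python
# Ergodicity in action: estimate the AR(1) coefficient from one trajectory.
import numpy as np

rng = np.random.default_rng(0)
phi, T = 0.7, 100_000
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# Method-of-moments estimate: lag-1 autocovariance over variance,
# both computed as time averages along the single realization.
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
print(phi, phi_hat)  # close, by the ergodic theorem for stationary AR(1)
```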
