Coarse graining

AFAICT, this is the question ‘how much worse do your predictions get as you discard information in some orderly fashion?’, as framed by physicists.

Do “renormalisation groups”, whatever they are, fit in here?
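Here is a throwaway sketch (mine, not from any reference) of the question as I read it: block-average an AR(1) series at successively coarser resolutions — the same move as a block-spin step, if I understand the renormalisation folklore — and watch how the best one-step linear prediction degrades as information is discarded.

```python
# Coarse-grain an AR(1) series by block-averaging and measure how
# one-step prediction error grows with the block size. Toy sketch only.
import numpy as np

rng = np.random.default_rng(1)

# A long AR(1) series: x_t = phi * x_{t-1} + noise_t.
phi, n = 0.95, 2 ** 14
noise = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + noise[t]


def coarse_grain(series, block):
    """Replace each non-overlapping block of `block` samples with its mean."""
    m = len(series) // block
    return series[: m * block].reshape(m, block).mean(axis=1)


def one_step_mse(series):
    """MSE of the best linear one-step predictor, normalised by variance."""
    a = np.dot(series[:-1], series[1:]) / np.dot(series[:-1], series[:-1])
    resid = series[1:] - a * series[:-1]
    return np.mean(resid ** 2) / np.var(series)


for block in (1, 2, 4, 8, 16, 32):
    y = coarse_grain(x, block)
    print(f"block={block:3d}  relative prediction error={one_step_mse(y):.3f}")
```

The relative error creeps towards 1 as the blocks grow, i.e. the coarse-grained series looks more and more like white noise — one crude way of quantifying "how much worse do the predictions get".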

Persistent Homology

What’s that?

Petri, G., Expert, P., Turkheimer, F., Carhart-Harris, R., Nutt, D., Hellyer, P. J., & Vaccarino, F. (2014). Homological scaffolds of brain functional networks. Journal of The Royal Society Interface, 11(101), 20140873. DOI

Talks about a fun-sounding “persistent homology” idea, which sounds a little like some kind of topological measure theory to my analysis-biased perspective:

Persistent homology is a recent technique in computational topology developed for shape recognition and the analysis of high dimensional datasets [36,37]. It has been used in very diverse fields, ranging from biology [38,39] and sensor network coverage [40] to cosmology [41]. Similar approaches to brain data [42,43], collaboration data [44] and network structure [45] also exist. The central idea is the construction of a sequence of successive approximations of the original dataset seen as a topological space X. This sequence of topological spaces $X_0, X_1, \dots, X_N = X$ is such that $X_i \subseteq X_j$ whenever $i < j$ and is called the filtration.
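To make the filtration idea concrete for myself, here is a minimal sketch (mine, not from the paper) of the dimension-0 case for a Vietoris–Rips filtration: grow balls around the data points, and record the scales at which connected components merge. It only needs numpy/scipy; the merge scales coincide with single-linkage clustering heights.

```python
# Dimension-0 persistent homology of a Vietoris-Rips filtration:
# every point is born as its own component at scale 0; when two
# components merge at scale d, one bar dies at d.
import numpy as np
from scipy.spatial.distance import pdist, squareform


def h0_persistence(points):
    """Return the death scales of H0 classes (n-1 finite bars)."""
    n = len(points)
    dists = squareform(pdist(points))
    # Edges of the filtration, ordered by the scale at which they appear.
    edges = sorted(
        (dists[i, j], i, j) for i in range(n) for j in range(i + 1, n)
    )
    parent = list(range(n))

    def find(x):  # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:            # two components merge at scale d
            parent[ri] = rj
            deaths.append(d)
    return np.array(deaths)     # one class never dies (the whole dataset)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Points on a noisy circle: components merge quickly along the circle,
    # so all H0 bars are short; the interesting loop lives in H1, which
    # this toy H0-only sketch does not compute.
    theta = rng.uniform(0, 2 * np.pi, 50)
    pts = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(50, 2))
    print(h0_persistence(pts)[-5:])  # the few longest-lived merges
```

For the higher-dimensional features the paper cares about (loops, voids), you would reach for a real library such as GUDHI or Ripser rather than hand-rolling the filtration like this.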

Canon

Refs