Random neural networks

Recurrent: Liquid State Machines / Echo State Networks / random reservoir networks

This sounds deliciously lazy. Very roughly speaking, your first layer is a reservoir of random saturating IIR filters. You fit a classifier on the outputs of this reservoir - possibly even allowing the network to converge to a steady state in some sense, so that the oscillations of the reservoir are not coupled to time.
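A minimal sketch of that recipe, assuming the standard leaky-integrator echo state update; run_reservoir and all the dimensions and parameter values here are illustrative choices, not any particular library's API:

```python
import numpy as np

rng = np.random.default_rng(42)

n_inputs, n_reservoir = 1, 200
spectral_radius = 0.9  # < 1, so the reservoir's memory fades rather than explodes
leak_rate = 0.3        # leaky integration makes each unit a saturating IIR filter

# Fixed random weights: these are never trained.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the fixed random reservoir with input series u; collect its states."""
    x = np.zeros(n_reservoir)
    states = np.empty((len(u), n_reservoir))
    for t, u_t in enumerate(u):
        pre = W_in @ np.atleast_1d(u_t) + W @ x
        x = (1 - leak_rate) * x + leak_rate * np.tanh(pre)
        states[t] = x
    return states
```

The classifier then sees only the collected states, never the random weights.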

Easy to implement, that. I wonder when it actually works - what constraints are needed on the topology, the spectral radius of the recurrence operator, and so on.

I wonder if you can use some kind of sparsifying transform on the recurrence operator?
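This sketch doesn't transform a dense operator after the fact; it just draws the recurrence sparse from the start, which is the standard trick in the echo state literature and presumably the baseline any such transform would compete with. Size and density values are placeholder guesses:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n_reservoir, density, spectral_radius = 1000, 0.01, 0.9

# Sparse random recurrence operator; most entries are exactly zero.
W = sp.random(n_reservoir, n_reservoir, density=density,
              random_state=rng, data_rvs=lambda k: rng.uniform(-1.0, 1.0, k))

# Rescale to the target spectral radius via sparse Arnoldi iteration,
# so the dense matrix is never formed.
lam = np.abs(spla.eigs(W, k=1, which="LM", return_eigenvectors=False)[0])
W = (W * (spectral_radius / lam)).tocsr()
```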

Some of these (Liquid State Machines in particular) claim to be based on spiking models, but AFAICT spiking is not at all necessary to the basic idea.

Various claims are made about how they avoid the training difficulties of comparably basic RNNs (vanishing and exploding gradients, costly backpropagation through time) by being essentially untrained: you use the reservoir as a feature factory for another supervised output algorithm.
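To make that feature-factory step concrete: the usual readout is just linear, fit by ridge regression in closed form on the collected reservoir states. A sketch continuing run_reservoir above; the regularisation strength and washout length are placeholder guesses:

```python
import numpy as np

def fit_readout(states, targets, ridge=1e-6):
    """Closed-form ridge regression readout: targets ~= states @ W_out.T."""
    S = states
    Y = targets.reshape(len(S), -1)
    gram = S.T @ S + ridge * np.eye(S.shape[1])
    return np.linalg.solve(gram, S.T @ Y).T

# Discard an initial "washout" so start-up transients don't pollute the fit:
# W_out = fit_readout(run_reservoir(u_train)[100:], y_train[100:])
# y_hat = run_reservoir(u_test) @ W_out.T
```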

Suggestive parallel with random projections. Not strictly recurrent, but the same general idea: HeWH16.
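Not the HeWH16 construction, just a minimal feedforward sketch of the same untrained-features idea, in the spirit of random-features methods: one fixed random projection through a nonlinearity, with only a linear model trained on top. Everything named here is an illustrative assumption:

```python
import numpy as np

def make_random_features(d, n_features=500, scale=1.0, seed=1):
    """Fix one random map up front, so train and test share the same features."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, scale / np.sqrt(d), size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return lambda X: np.cos(X @ W + b)  # random-Fourier-style features

# phi = make_random_features(d=X_train.shape[1])
# ...then fit any cheap linear classifier on phi(X_train), score on phi(X_test).
```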

LuJa09 has an interesting taxonomy (although even if the exact details were right at the time, the evolution of neural network research makes them questionable now):

From a dynamical systems perspective, there are two main classes of RNNs. Models from the first class are characterized by an energy-minimizing stochastic dynamics and symmetric connections. The best known instantiations are Hopfield networks, Boltzmann machines, and the recently emerging Deep Belief Networks. These networks are mostly trained in some unsupervised learning scheme. Typical targeted network functionalities in this field are associative memories, data compression, the unsupervised modeling of data distributions, and static pattern classification, where the model is run for multiple time steps per single input instance to reach some type of convergence or equilibrium (but see e.g., TaHR06 for extension to temporal data). The mathematical background is rooted in statistical physics. In contrast, the second big class of RNN models typically features a deterministic update dynamics and directed connections. Systems from this class implement nonlinear filters, which transform an input time series into an output time series. The mathematical background here is nonlinear dynamical systems. The standard training mode is supervised.

Random convolutions

TBD.

(Particularly) Random Training

Refs