Automatic differentiation

Getting your computer to tell you the gradient of a function, without resorting to finite-difference approximation. I am mostly interested here in automatic forward- and reverse-mode differentiation, which is not, as such, a symbolic technique, although symbolic differentiation gets an incidental look-in.
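For contrast, here is the thing AD lets you avoid: a central finite-difference approximation, whose accuracy hinges on a step size you must tune by hand (`fd_grad` is an illustrative name, not a library function).

```python
import math

def fd_grad(f, x, h=1e-5):
    # Central difference: O(h^2) truncation error, but floating-point
    # cancellation grows as h shrinks, so h must be chosen with care.
    return (f(x + h) - f(x - h)) / (2 * h)

print(fd_grad(math.sin, 1.0))  # ~0.5403, approximates cos(1)
print(math.cos(1.0))           # the exact derivative AD would deliver
```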

Infinitesimal/Taylor-series formulations, the closely related dual-number formulations, and even fancier hyperdual formulations. Reverse mode, a.k.a. backpropagation, versus forward mode, etc. Computational complexity of all of the above.
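To make the dual-number idea concrete, here is a minimal forward-mode sketch, assuming nothing beyond the standard library (the `Dual` class and `sin` wrapper are illustrative, not any particular library's API): augment every number with a derivative component and let the chain rule ride along through ordinary arithmetic.

```python
import math

class Dual:
    """A number a + b*eps with eps**2 == 0: value plus derivative."""
    def __init__(self, value, deriv=0.0):
        self.value = value   # f(x)
        self.deriv = deriv   # f'(x)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x):
    # chain rule: (sin f)' = cos(f) * f'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

# Differentiate f(x) = x*sin(x) + x at x = 2 by seeding deriv = 1.
x = Dual(2.0, 1.0)
f = x * sin(x) + x
print(f.value, f.deriv)  # f(2) and f'(2) = sin(2) + 2*cos(2) + 1
```

One forward pass like this yields the derivative with respect to a single seeded input, which is why forward mode is cheap for few inputs and many outputs, and reverse mode is preferred in the opposite, machine-learning-typical case.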

There is a beautiful explanation of the basics by Sanjeev Arora and Tengyu Ma.

You might want to do this for optimisation, whether batch or SGD, especially in neural networks, matrix factorisations, variational approximations etc. This is not news these days, but it took a stunningly long time to become common; see, e.g., Justin Domke, who claimed automatic differentiation to be "the most criminally underused tool in the machine learning toolbox".
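As a sketch of what reverse mode buys you for optimisation, here is a toy tape-based backpropagation pass driving one gradient-descent step; `Var` and `backward` are illustrative names of my own, not any real library's API.

```python
class Var:
    """A node in the computation graph, recorded as it is built."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent Var, local partial) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __sub__(self, other):
        return Var(self.value - other.value, [(self, 1.0), (other, -1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(output):
    # Topologically order the graph, then apply the chain rule in
    # reverse; one sweep yields the partials w.r.t. *every* input,
    # which is why reverse mode suits scalar losses with many weights.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += local * node.grad

# One SGD step on the scalar loss L(w) = (w*x - y)**2.
w, x, y = Var(0.5), Var(3.0), Var(6.0)
r = w * x - y
loss = r * r
backward(loss)
w = Var(w.value - 0.1 * w.grad)     # gradient-descent update
print(loss.value, w.grad, w.value)  # 20.25, -27.0, 3.2
```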

See also symbolic mathematical calculators.

Software

Refs