The Living Thing / Notebooks:

Automatic differentiation

Getting your computer to tell you the gradient of a function, without resorting to finite-difference approximation.
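To see why you might want to avoid finite differences: they only approximate the derivative, with an error that trades off truncation against floating-point round-off. A minimal sketch (the function `fd_derivative` and the choice of step size `h` are illustrative, not from any particular library):

```python
import math

# Central finite difference: accuracy is limited by a trade-off between
# truncation error (shrinks with h) and round-off error (grows as h shrinks).
def fd_derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

# The exact derivative of sin is cos; the finite-difference
# result is close, but never exact.
approx = fd_derivative(math.sin, 1.0)
exact = math.cos(1.0)
print(abs(approx - exact))  # small but nonzero error
```

Automatic differentiation, by contrast, gives derivatives that are exact up to floating-point precision.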

There seems to be a lot of stuff to know here: infinitesimal/Taylor-series formulations, the closely related dual-number formulation, and even fancier hyperdual formulations; reverse mode, a.k.a. backpropagation, versus forward mode; the computational complexity of all of the above. But for special cases you can ignore most of this.
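The dual-number formulation of forward mode is simple enough to sketch in a few lines: each value carries its derivative alongside, and arithmetic propagates both by the chain rule. A minimal illustration (this toy `Dual` class supports only `+` and `*`; a real implementation would cover the full operator and function set):

```python
# Forward-mode AD via dual numbers: a value and its derivative travel together.
class Dual:
    def __init__(self, val, deriv=0.0):
        self.val, self.deriv = val, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.deriv * other.val + self.val * other.deriv)

    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with derivative 1, then read the derivative off the output.
    return f(Dual(x, 1.0)).deriv

# d/dx (x*x + 3x) at x = 2 is 2x + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Forward mode costs one pass per input variable, which is why reverse mode wins for the many-inputs, few-outputs functions typical of machine learning.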

There is a beautiful explanation of the basics by Sanjeev Arora and Tengyu Ma.

You might want to do this for optimisation, whether batch or SGD, especially in neural networks, matrix factorisations, variational approximation etc. This is not news these days, but it took a stunningly long time to become common; see, e.g., Justin Domke, who asked whether automatic differentiation is "the most criminally underused tool in the machine learning toolbox".
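The optimisation use case above is where reverse mode (backpropagation) earns its keep: one backward sweep yields the gradient with respect to every parameter. A minimal sketch of a tape-based reverse mode driving one gradient-descent step (this toy `Var` class and the loss `(x - 3)^2` are illustrative assumptions, not any particular library's API):

```python
# Reverse-mode AD: record the computation graph, then sweep gradients
# from the output back to the inputs via the chain rule.
class Var:
    def __init__(self, val, parents=()):
        self.val = val
        self.grad = 0.0
        self.parents = parents  # pairs of (parent node, local derivative)

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __sub__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.val - other.val, [(self, 1.0), (other, -1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.val * other.val, [(self, other.val), (other, self.val)])

    def backward(self):
        # Topologically order the graph so each node is processed once,
        # then accumulate gradients output-to-input.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for parent, _ in node.parents:
                    visit(parent)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local in node.parents:
                parent.grad += local * node.grad

x = Var(0.0)
loss = (x - 3.0) * (x - 3.0)   # f(x) = (x - 3)^2; df/dx = 2(x - 3) = -6 at x = 0
loss.backward()
x_new = x.val - 0.1 * x.grad   # one gradient-descent step, learning rate 0.1
print(x.grad)                  # -6.0; x moves toward the minimum at 3
```

The same pattern, industrialised, is what frameworks like PyTorch and JAX provide.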

See also symbolic mathematical calculators.