The Living Thing / Notebooks :

Matrix calculus

Usefulness: 🔧
Novelty: 💡
Uncertainty: 🤪 🤪 🤪
Incompleteness: 🚧 🚧 🚧

You can generalise high-school calculus, which is about scalar functions of a scalar argument, in various ways to handle functions with a vector or matrix value, or argument.

I will mention two convenient and popular formalisms for doing that here. In practice a mix of the two is often useful.

Matrix differentials

🚧 I need to return to this and tidy it up with some examples.

A special case of tensor calculus that happens to be handy for some common cases, where the rank of the argument and the value of the function is not too big. Fun pain point: agreeing upon the layout of derivatives, numerator versus denominator.

If your problem is nice, this often gets you a very compact and tidy solution without much fuss. However, physicists will shame you for not using tensors. (See next section.)
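As a minimal sketch of the matrix-differential workflow (my own toy example, not from the references): for the quadratic form f(x) = xᵀAx, the differential is df = xᵀ(A + Aᵀ)dx, so the gradient (in denominator layout) is (A + Aᵀ)x. A quick finite-difference check with numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

# Analytic gradient from the differential d(x.T A x) = x.T (A + A.T) dx.
grad_analytic = (A + A.T) @ x

# Central finite differences as a sanity check.
eps = 1e-6
grad_numeric = np.empty_like(x)
for i in range(4):
    e = np.zeros(4)
    e[i] = eps
    grad_numeric[i] = ((x + e) @ A @ (x + e) - (x - e) @ A @ (x - e)) / (2 * eps)

assert np.allclose(grad_analytic, grad_numeric, atol=1e-4)
```

The layout pain point mentioned above shows up here: in numerator layout the same derivative would be the row vector xᵀ(A + Aᵀ), i.e. the transpose.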

Indexed tensor calculus

Keywords: Ricci calculus, Einstein summation notation, index notation or subscript notation.

If you crack open a tensor textbook you get a lot of guff about general relativity and tensor fields and such, which is all very nice but not germane. We want the entry-level bit, which is some tidy notation conventions for dealing with multilinear operations without too many squiggles all over the place.
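The summation convention maps directly onto `numpy.einsum`, which makes it easy to play with. A toy illustration (mine, not from the references): the quadratic form f = xᵢAᵢⱼxⱼ and its gradient gₖ = Aₖⱼxⱼ + xᵢAᵢₖ, written index-first.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# f = x_i A_ij x_j  (repeated indices are summed over)
f = np.einsum('i,ij,j->', x, A, x)

# g_k = A_kj x_j + x_i A_ik  -- differentiate by relabelling indices,
# no layout conventions to argue about.
g = np.einsum('kj,j->k', A, x) + np.einsum('i,ik->k', x, A)

# Agrees with the matrix-notation versions.
assert np.allclose(f, x @ A @ x)
assert np.allclose(g, (A + A.T) @ x)
```

The nice property is that every index is explicit, so there is no transposition ambiguity: the expression itself tells you which axes contract.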

Soeren Laue, Matthias Mitterreiter, Joachim Giesen and Jens K. Mueller have been popularising this approach recently. In their paper, they argue that the derivation of matrix differential results can be greatly simplified with Ricci calculus, and, as a bonus, that it often induces faster code.

They have a website which showcases this trick, doing symbolic matrix calculus online (though not the accelerated code generation bit).

Here are some tasty readings on the relevant bits of tensor machinery.


Giles, Mike B. 2008. “Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation.” In Advances in Automatic Differentiation, edited by Christian H. Bischof, H. Martin Bücker, Paul Hovland, Uwe Naumann, and Jean Utke, 64:35–44. Berlin, Heidelberg: Springer Berlin Heidelberg.

Golub, Gene H., and Charles F. van Loan. 1983. Matrix Computations. JHU Press.

Graham, Alexander. 1981. Kronecker Products and Matrix Calculus: With Applications. Horwood.

Laue, Soeren, Matthias Mitterreiter, and Joachim Giesen. 2018. “Computing Higher Order Derivatives of Matrix and Tensor Expressions.” In Advances in Neural Information Processing Systems 31, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, 2750–9. Curran Associates, Inc.

Magnus, Jan R., and Heinz Neudecker. 1999. Matrix Differential Calculus with Applications in Statistics and Econometrics. Rev. ed. New York: John Wiley.

Minka, Thomas P. 2000. “Old and New Matrix Algebra Useful for Statistics.”

Parr, Terence, and Jeremy Howard. 2018. “The Matrix Calculus You Need for Deep Learning,” February.

Petersen, Kaare Brandt, and Michael Syskind Pedersen. 2012. “The Matrix Cookbook.”

Seber, George A. F. 2007. A Matrix Handbook for Statisticians. Wiley.

Steeb, Willi-Hans. 2006. Problems and Solutions in Introductory and Advanced Matrix Calculus. World Scientific.

Turkington, Darrell A. 2001. Matrix Calculus and Zero-One Matrices. Cambridge University Press.