Linear algebra

If the thing is twice as big, the transformed version of the thing is also twice as big.
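
That quip is the scaling (homogeneity) half of what "linear" means; spelled out in full, a map T is linear when it respects both scaling and addition:

$$ T(\alpha x + \beta y) = \alpha T(x) + \beta T(y) \quad \forall \, x, y \text{ and scalars } \alpha, \beta. $$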

Oh! the hours I put into studying the taxonomy and husbandry of matrices. Time has passed. I have forgotten much. Jacobians have begun to seem downright Old Testament.

And when you put the various operations of matrix calculus into the mix (derivative of the trace of a skew-Hermitian heffalump painted with a camel-hair brush), the combinatorial explosion of theorems and identities is intimidating.

Things I need to learn:

Basic linear algebra intros

For a linear operator T on a real inner product space,

$$ \langle T x, x \rangle = 0 \,\, \forall x \,\, \iff \,\, T^\ast = -T $$

whereas for an operator on a complex inner product space,

$$ \langle T x, x \rangle = 0 \,\, \forall x \,\, \iff \,\, T = 0. $$

Cool.
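
A throwaway numpy sketch of both halves (the matrices here are just random examples, nothing canonical about them):

```python
import numpy as np

rng = np.random.default_rng(0)

# Real case: T skew-symmetric (T^T = -T) forces <Tx, x> = x^T T x = 0 for every x.
A = rng.standard_normal((4, 4))
T = A - A.T
for _ in range(3):
    x = rng.standard_normal(4)
    print(x @ T @ x)  # ~0, up to floating-point rounding

# Complex case: the analogous skew-Hermitian construction does NOT kill <Tz, z>;
# it only forces it to be purely imaginary. Only T = 0 makes it vanish for all z.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
S = B - B.conj().T
z = rng.standard_normal(4) + 1j * rng.standard_normal(4)
print(np.vdot(z, S @ z))  # real part ~0, imaginary part generally nonzero
```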

Without using determinants, we will define the multiplicity of an eigenvalue and prove that the number of eigenvalues, counting multiplicities, equals the dimension of the underlying space. Without determinants, we’ll define the characteristic and minimal polynomials and then prove that they behave as expected. Next, we will easily prove that every matrix is similar to a nice upper-triangular one. Turning to inner product spaces, and still without mentioning determinants, we’ll have a simple proof of the finite-dimensional Spectral Theorem.

Determinants are needed in one place in the undergraduate mathematics curriculum: the change of variables formula for multi-variable integrals. Thus at the end of this paper we’ll revive determinants, but not with any of the usual abstruse definitions. We’ll define the determinant of a matrix to be the product of its eigenvalues (counting multiplicities). This easy-to-remember definition leads to the usual formulas for computing determinants. We’ll derive the change of variables formula for multi-variable integrals in a fashion that makes the appearance of the determinant there seem natural.

That is Axler, in Down with Determinants; he wrote a whole textbook on this basis, Axler14.
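
The "product of eigenvalues" definition is easy to poke at numerically; a minimal numpy sketch (random matrix, eigenvalues counted with multiplicity as numpy returns them):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

eigvals = np.linalg.eigvals(A)    # complex in general, repeated according to multiplicity
det_from_eigs = np.prod(eigvals)  # determinant as the product of the eigenvalues
det_builtin = np.linalg.det(A)    # numpy's usual LU-based determinant

# They agree; any tiny imaginary part is floating-point noise from the
# complex-conjugate eigenvalue pairs.
print(det_from_eigs, det_builtin)
```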

Most of the time when people talk about linear algebra (even mathematicians), they'll stick entirely to the linear map perspective or the data perspective, which is kind of frustrating when you're learning it for the first time. It seems like the data perspective is just a tidy convenience, that it just "makes sense" to put some data in a table. In my experience the singular value decomposition is the first time that the two perspectives collide, and (at least in my case) it comes with cognitive dissonance.
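
To make the collision concrete, here is a minimal numpy sketch that reads one and the same array both ways (the shapes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Data perspective: a table of 6 observations of 3 features.
X = rng.standard_normal((6, 3))

# Linear map perspective: the same array is a map R^3 -> R^6, and the SVD
# factors it into rotate, scale along axes, rotate.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(X, U @ np.diag(s) @ Vt))  # exact reconstruction

# Back to the data perspective: the leading right-singular vector is the
# single direction that best summarises the rows (PCA, once you centre X).
print(Vt[0], s[0])
```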

Linear algebra and calculus

The multidimensional statistics/control theory workhorse.

See matrix calculus.

Multilinear algebra

Oooh you are playing with tensors? I don’t have a bunch to say here but here is a compact explanation of Einstein summation, which turns out to be as simple as it needs to be, but no simpler.
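
For the numerically inclined, numpy's einsum speaks exactly this notation; a few standard contractions as illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
v = rng.standard_normal(4)

# A repeated index is summed over, exactly as in Einstein notation.
print(np.allclose(np.einsum('ij,jk->ik', A, B), A @ B))  # matrix product: C_ik = A_ij B_jk
print(np.allclose(np.einsum('ij,j->i', A, v), A @ v))    # matrix-vector:  y_i = A_ij v_j
print(np.einsum('ii->', np.eye(4)))                      # trace: M_ii = 4.0 here
```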

Refs