Are linear transformations in $\mathbb{R}^n$ without matrix representation useful?


In a first-year engineering linear algebra class at my institution, the students learn about general linear transformations. I understand that, in an abstract setting, the general properties of linear transformations are important. However, for the vast majority of the time, the students in the class will be working in $\mathbb{R}^n$, where every linear transformation has a well-defined matrix representation. The students are often confused about the distinction between a transformation and its matrix representation.

My question is this: presuming that the students will work in $\mathbb{R}^n$ for their entire careers with linear algebra, are there any properties of linear transformations that are more easily taught in general? In other words, if we never even mentioned the term "linear transformation" and spoke only of matrices and matrix-vector algebra, what would be lost?


There are 4 best solutions below

---

Since your vector space is $n$-dimensional, every linear transformation from the vector space to itself has a transformation matrix. However, some non-linear transformations can be represented by matrices, too. See here.
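One standard instance of a non-linear map represented by a matrix is translation, which is not linear on $\mathbb{R}^2$ (it moves the origin) but becomes matrix multiplication in homogeneous coordinates. A minimal sketch (the particular translation by $(2, 3)$ is my own illustrative choice, not from the answer above):

```python
import numpy as np

# Translation by (2, 3) is not linear on R^2: it does not fix the origin.
# Lifting points (x, y) to homogeneous coordinates (x, y, 1) turns it into
# multiplication by a 3x3 matrix.
T = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0]])

def translate(p):
    """Apply the translation to a point p in R^2 via its homogeneous lift."""
    x, y = p
    hx, hy, w = T @ np.array([x, y, 1.0])
    return np.array([hx / w, hy / w])

print(translate(np.array([5.0, -1.0])))  # maps (5, -1) to (7, 2)
print(translate(np.zeros(2)))            # the origin moves, so the map is not linear
```

This is the same trick that makes affine maps composable by plain matrix products in computer graphics.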

---

It's similar at the school I teach at. I like to remind the students that a linear transformation is also something that is well defined on function spaces, and then give them examples like $\frac{d}{dx}$ or $L[y] = y'' + 2y' + y$. I tell them that even though the spaces these linear operators act on are infinite-dimensional, there are significant finite-dimensional subspaces, e.g. the solution space of $L[y] = 0$, but that they'll have to take differential equations to see the specialist techniques for dealing with them.
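The point that $\frac{d}{dx}$ acquires a matrix once restricted to a finite-dimensional subspace can be made concrete. A small sketch using the monomial basis of polynomials of degree below 4 (the basis choice is mine, for illustration):

```python
import numpy as np

# d/dx restricted to polynomials of degree < 4 is a linear map on a
# 4-dimensional subspace. In the basis {1, x, x^2, x^3}, column j of D
# holds the coordinates of d/dx applied to x^j: (x^j)' = j * x^(j-1).
D = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3],
              [0, 0, 0, 0]], dtype=float)

# p(x) = 1 + 2x + 3x^2 + 4x^3, stored as a coefficient vector.
p = np.array([1, 2, 3, 4], dtype=float)

# D @ p gives the coefficients of p'(x) = 2 + 6x + 12x^2.
print(D @ p)
```

The same construction works for $L[y] = y'' + 2y' + y$ on this subspace, since $L = D^2 + 2D + I$ as matrices.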

---

One interesting approach that shouldn't take the students too far away from what they are already familiar with might be to consider linear transformations acting on matrices, rather than on vectors. Of course an $n \times n$ matrix can be identified with a vector in $\mathbb{R}^{n^2}$, but the extra work caused by the reshaping may help to motivate why a more abstract definition of a linear transformation is useful.

For example, you might consider the "linear transformation" $M \mapsto M^T$ on matrices, justify that it is indeed linear using the abstract definition, and then find an explicit matrix representation using the $\operatorname{vec}(\cdot)$ operator, as for instance in this question.

---

Everything would be lost. Linear algebra would become mindless data shuffling, with no structural insight and none of the geometric intuition that comes with it. Think of the so-called Levi-Civita calculus of the first half of the last century, with its screes of indices moved up and down without any understanding of what was going on. Or think of the idea of eigenspaces, of Sylvester's theorem (important for engineers!), and so on.