What effect does transposing the matrix of a linear transformation have on the transformation?


I came across the following result in a book on Mathematical Physics, where it's given without proof. This is related to: How to prove that a homogeneous system of equations along with its "transposed" system contain the same number of linearly independent solutions?.

But I'm trying to prove it using minimal matrix machinery (e.g. no row-echelon reduction), so I'm not interested in the answer to the linked question.

For the system of equations: $$\tag1 \sum_{k=1}^n a_{ik}x_k = y_i, ~~~~~~~~~~~~~~ (i=1,2,\ldots, n)$$

the following alternative holds: Either it has one and only one solution $\bf{x}$ for each arbitrarily given vector $\bf{y}$, in particular the solution $\bf{x}=0$ for $\bf{y}=0$; or, alternatively, the homogeneous equations arising from $(1)$ (when $\bf{y}=0$) have a positive number $m$ of nontrivial linearly independent solutions $\bf x_1, x_2, \ldots, x_m$, which may be assumed to be normalized. In the latter case, the "transposed" homogeneous system of equations

$$\sum_{k=1}^n a'_{ik}x_k = 0, ~~~~~~~~~~~~~~ (i=1,2,\ldots, n)$$

where $a'_{ik} = a_{ki}$, also has exactly $m$ linearly independent nontrivial solutions $\bf x'_1, x'_2, \ldots, x'_m$. The inhomogeneous system $(1)$ then possesses solutions for just those vectors $\bf y$ which are orthogonal to $\bf x'_1, x'_2, \ldots, x'_m$. These solutions are determined only to within an additive term which is an arbitrary solution of the homogeneous system of equations, i.e. if $\bf x$ is a solution of the inhomogeneous system and $\bf x_j$ is any solution of the homogeneous system, then $\bf x+x_j$ is also a solution of the inhomogeneous system.

The system can be viewed as a linear transformation $T$ from $\mathbb{R}^n$ to $\mathbb{R}^n$. Either $\dim(\ker(T)) = m > 0$ (i.e. the homogeneous equations arising from $(1)$ when $\bf{y}=0$ have a positive number $m$ of nontrivial linearly independent solutions), or the kernel is the trivial subspace $\{\mathbf{0}\}$. In the latter case the linear map is injective, hence bijective by rank–nullity (since the source and target have the same dimension). If it's bijective, then every vector $\bf y$ has a unique preimage (i.e. solution) $\bf x$. This takes care of the first part.
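As a quick sanity check of the bijective case, here is a minimal NumPy sketch. The matrix is a hypothetical invertible example chosen for illustration: its kernel is trivial, so `np.linalg.solve` returns the unique preimage of any given $\bf y$.

```python
import numpy as np

# Hypothetical invertible 3x3 matrix (full rank, so ker(T) = {0}).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

# Full rank means dim(ker(T)) = 0 by rank-nullity.
assert np.linalg.matrix_rank(A) == 3

y = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, y)       # the unique solution of A x = y
assert np.allclose(A @ x, y)
```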

The last part is also clear: if $\mathbf{x}_j = (x_{j1}, x_{j2}, \ldots, x_{jn})$ and $\mathbf{x} = (x_1, x_2, \ldots, x_n)$, then

$$\sum_{k=1}^n a_{ik}(x_k+x_{jk}) = \sum_{k=1}^n a_{ik}x_k+\sum_{k=1}^n a_{ik}x_{jk} = y_i+0 = y_i$$
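This computation is easy to verify numerically. The sketch below uses a hypothetical singular matrix and a kernel vector of it, both chosen purely for illustration:

```python
import numpy as np

# Hypothetical singular matrix: the third row is the sum of the first two.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])

x = np.array([1.0, 1.0, 1.0])        # a particular solution
y = A @ x                            # the corresponding right-hand side
x_j = np.array([2.0, -1.0, 1.0])     # a solution of the homogeneous system

assert np.allclose(A @ x_j, 0)         # x_j lies in ker(T)
assert np.allclose(A @ (x + x_j), y)   # x + x_j also solves the system
```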

Assuming that $T'$ is the linear transformation corresponding to the "transposed" system of equations, the middle part essentially states that $\dim(\ker(T)) = \dim(\ker(T')) = m$, and that $\ker(T')$ and $\operatorname{Im}(T)$ are orthogonal complements of each other (since $(1)$ is solvable precisely for those $\bf y$ orthogonal to $\ker(T')$). How do these two facts follow? What effect does transposing the matrix of $T$ have on the resulting transformation $T'$?

Obviously matrices are involved, but I'd still appreciate an answer that relies more on basic principles rather than too many matrix results.
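Both claimed facts are at least easy to check numerically. A minimal sketch, using a hypothetical rank-2 matrix; the `null_space` helper here is an assumed SVD-based implementation, not a standard NumPy function:

```python
import numpy as np

def null_space(A, tol=1e-10):
    # Orthonormal basis of ker(A): right singular vectors whose
    # singular values are numerically zero.
    _, s, vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

# Hypothetical rank-2 matrix (third row = first row + second row).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])

N = null_space(A)       # basis of ker(T)
Nt = null_space(A.T)    # basis of ker(T')

# Both kernels have the same dimension m (here m = 1).
assert N.shape[1] == Nt.shape[1] == 1

# Every vector in Im(T) is orthogonal to ker(T').
y = A @ np.array([1.0, 1.0, 1.0])
assert np.allclose(Nt.T @ y, 0)
```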

On BEST ANSWER

Maybe this helps: If $v\in\ker A$ then $v\perp a^{i}$ for $i=1,\dots,n$, where $a^{i}$ denotes the $i$-th row of $A$. Thus $v\perp\operatorname{span}\left\{ a^{i}\;\vert\; i=1,\dots,n\right\}$. The $a^{i}$ are also the columns of $A^{T}$, and $\operatorname{span}\left\{ a^{i}\;\vert\; i=1,\dots,n\right\} =\operatorname{im} A^{T}$ (reading the rows as column vectors). Hence $\ker A=(\operatorname{im} A^{T})^{\perp}$, so $\dim\ker A=n-\operatorname{rank} A^{T}$; combined with rank–nullity for $A$ ($\dim\ker A=n-\operatorname{rank} A$) this gives $\operatorname{rank} A=\operatorname{rank} A^{T}$, hence $\dim\ker A=\dim\ker A^{T}$. Applying the same argument to $A^{T}$ yields $\operatorname{im} A=(\ker A^{T})^{\perp}$.
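A small numerical illustration of this argument, with a hypothetical matrix and kernel vector chosen for the purpose: a vector in $\ker A$ is orthogonal to every row of $A$, hence to any element of $\operatorname{im} A^{T}$.

```python
import numpy as np

# Hypothetical singular matrix (third row = first row + second row).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])

v = np.array([2.0, -1.0, 1.0])        # v lies in ker(A)
assert np.allclose(A @ v, 0)

# v is orthogonal to each row a^i of A ...
for row in A:
    assert abs(row @ v) < 1e-12

# ... and im(A^T) is the span of those rows, so v is orthogonal to it.
w = A.T @ np.array([1.0, -2.0, 0.5])  # an arbitrary element of im(A^T)
assert abs(w @ v) < 1e-12
```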