Can linearity on subspaces imply linearity on their direct sum?


I read an interesting fact while learning the calculus of functions of several variables: for fixed bases of $\mathbb{R}^m$ and $\mathbb{R}^n$, a linear mapping $L:\mathbb{R}^m \rightarrow \mathbb{R}^n$ can be regarded as an $n$-tuple $L = (L^1,\ldots,L^n)$ of coordinate mappings $L^j:\mathbb{R}^m \rightarrow \mathbb{R}$, and \begin{equation}L: \mathbb{R}^m \rightarrow \mathbb{R}^n\end{equation} is linear if and only if each coordinate mapping \begin{equation} L^j:\mathbb{R}^m\to \mathbb{R}\end{equation} is linear. This fact is used to build the differential calculus of functions of several variables, and I understand it. But I have also been learning linear algebra, so I wondered whether the fact above can be generalized, which raised the questions below:
Suppose the vector spaces $V$ and $W$ are finite-dimensional, $W^1,\ldots,W^n$ are subspaces of $W$, and $W^1\oplus \cdots \oplus W^n = W$.
Q1: Is a linear map $L: V \rightarrow W$ "equivalent" to $n$ linear maps $L^j : V \rightarrow W^j$? If so, what are the expressions for these $L^j$, and how does one prove it?
Q2: Does the equivalence "each $L^j$ is linear if and only if $L$ is linear" still hold?
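For concreteness, the coordinate-mapping fact I started from can be checked numerically. Below is a minimal sketch (the matrix `A` and dimensions $m=2$, $n=3$ are my own illustrative choices, not from any textbook): the map $L$ is given by a $3\times 2$ matrix, and its coordinate mappings $L^j$ are the rows.

```python
# Illustrative check of the coordinate-mapping fact for m = 2, n = 3.
# Row j of A defines the coordinate map L^j : R^2 -> R.
A = [[1, 2],
     [0, -1],
     [3, 5]]

def L(v):
    """Apply L to v in R^2, returning a vector in R^3."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def Lj(j, v):
    """The j-th coordinate mapping L^j : R^2 -> R."""
    return sum(a * x for a, x in zip(A[j], v))

v1, v2, lam = [1.0, 2.0], [-3.0, 4.0], 2.5

# Each L^j is linear, and together the L^j determine L coordinatewise.
for j in range(3):
    assert Lj(j, [v1[0] + v2[0], v1[1] + v2[1]]) == Lj(j, v1) + Lj(j, v2)
    assert Lj(j, [lam * v1[0], lam * v1[1]]) == lam * Lj(j, v1)
    assert L(v1)[j] == Lj(j, v1)
```

The same check works for any choice of matrix, since linearity of $L$ and of each row map are both equivalent to the matrix description itself.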


It turns out this is straightforward.
$\Rightarrow$: Suppose we have a linear map $T \in \mathcal{L}(V,W)$, with $W = W^1 \oplus \cdots \oplus W^n$. For $v \in V$ we have $Tv \in W$, so by the definition of a direct sum, $Tv$ can be written uniquely in the form $w_1+\cdots+w_n$ with $w_j \in W^j$; this uniqueness lets us define $T^j:V\rightarrow W^j$ by $T^jv = w_j$. We can prove each $T^j$ is linear. Indeed, for a vector $v_1 \in V$ we have \begin{equation}Tv_1 = w_1^1+\cdots+w_1^n, \end{equation} where $w_1^j \in W^j$. Similarly, for a vector $v_2 \in V$, \begin{equation} Tv_2 = w_2^1+ \cdots+w_2^n.\end{equation} By the definition of $T^j$, $T^jv_1 = w_1^j$ and $T^jv_2 = w_2^j$, so $T^jv_1 + T^jv_2=w_1^j+w_2^j$.
Using linearity of $T$, $T(v_1+v_2)=Tv_1+Tv_2=(w_1^1+w_2^1)+\cdots+(w_1^n + w_2^n)$, and each $w_1^j+w_2^j$ lies in $W^j$.
On the other hand, by uniqueness of the direct-sum decomposition and the definition of $T^j$, \begin{equation}T^j(v_1+v_2)=w_1^j+w_2^j=T^jv_1+T^jv_2. \end{equation}
Likewise, from $Tv=w_1+\cdots+w_n$ we get $T(\lambda v)=\lambda (Tv)=\lambda w_1+\cdots+ \lambda w_n$ with $\lambda w_j \in W^j$, so \begin{equation}T^j (\lambda v)= \lambda w_j=\lambda (T^jv).\end{equation} Thus we have proved: a given linear transformation $T$ uniquely determines a tuple of $n$ linear transformations $(T^1, \ldots, T^n)$, and the linearity of $T$ implies the linearity of each $T^j$.
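As a numerical sanity check of this direction (the matrix and the particular splitting are illustrative choices of mine), here is a sketch where $W = \mathbb{R}^3$ splits as $W^1 \oplus W^2$ with $W^1 = \operatorname{span}(e_1, e_2)$ and $W^2 = \operatorname{span}(e_3)$, and $T^1, T^2$ are read off from the unique decomposition of $Tv$:

```python
# Forward direction: T linear implies each component map T^j is linear.
A = [[1, 0], [2, 1], [0, 3]]  # matrix of T : R^2 -> R^3

def T(v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def T1(v):
    """Component of Tv in W^1 = span(e1, e2)."""
    w = T(v)
    return [w[0], w[1], 0]

def T2(v):
    """Component of Tv in W^2 = span(e3)."""
    w = T(v)
    return [0, 0, w[2]]

v1, v2 = [1, 2], [3, -1]
vsum = [v1[0] + v2[0], v1[1] + v2[1]]

# T^j inherits additivity from T and uniqueness of the decomposition.
assert T1(vsum) == [a + b for a, b in zip(T1(v1), T1(v2))]
assert T2(vsum) == [a + b for a, b in zip(T2(v1), T2(v2))]
# And Tv = T^1 v + T^2 v recovers T.
assert T(v1) == [a + b for a, b in zip(T1(v1), T2(v1))]
```

Note the check that $Tv = T^1v + T^2v$ at the end: it is exactly the "equivalence" asked about in Q1.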
$\Leftarrow$: The proof of the other direction is simple. Suppose we have linear maps $T^j:V \rightarrow W^j$. Since each $W^j$ is a subspace of $W$, we may regard each $T^j$ as a map $T^j:V \rightarrow W$, so $T^j \in \mathcal{L}(V,W)$. Because $\mathcal{L}(V,W)$ is itself a vector space, the maps $T^j$, $j=1,\ldots,n$, determine a unique linear map $T=\sum_{j=1}^{n} T^j \in \mathcal{L}(V,W)$, and by construction the $W^j$-component of $Tv$ is exactly $T^jv$.
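The reverse direction can be sketched the same way (again with illustrative maps of my own choosing): two linear maps into the subspaces $W^1 = \operatorname{span}(e_1, e_2)$ and $W^2 = \operatorname{span}(e_3)$ of $W = \mathbb{R}^3$ are summed in $\mathcal{L}(V,W)$, and the resulting $T$ is verified to be linear.

```python
# Reverse direction: summing linear maps T^j : V -> W^j (regarded as
# maps into W) yields a linear map T = T^1 + T^2 in L(V, W).

def T1(v):  # linear map into W^1 = span(e1, e2)
    return [v[0] + v[1], 2 * v[0], 0]

def T2(v):  # linear map into W^2 = span(e3)
    return [0, 0, 3 * v[1]]

def T(v):
    """T = T^1 + T^2, with addition taken in L(V, W)."""
    return [a + b for a, b in zip(T1(v), T2(v))]

v1, v2, lam = [1, 4], [2, -3], 3
vsum = [v1[0] + v2[0], v1[1] + v2[1]]

# T is additive and homogeneous, hence linear.
assert T(vsum) == [a + b for a, b in zip(T(v1), T(v2))]
assert T([lam * v1[0], lam * v1[1]]) == [lam * w for w in T(v1)]
```

This mirrors the vector-space argument: closure of $\mathcal{L}(V,W)$ under addition is what makes $T$ linear without any further computation.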