Let $f\colon V\rightarrow W$ be a linear transformation, where $V,W$ are $\mathbb{K}$-vector spaces and $U_{1},U_{2}$ are subspaces of $W$. Show that:
$f^{-1}(U_{1})+f^{-1}(U_{2})\subset f^{-1}(U_{1}+U_{2})$
Now my idea would be to show that all basis vectors of $f^{-1}(U_{1})$ and $f^{-1}(U_{2})$ lie in $f^{-1}(U_{1}+U_{2})$, and that therefore every other $v$ on the L.H.S. does as well. However, my proof seems a little clumsy, so I would appreciate some other ideas.
If $x \in f^{-1}(U_1)+f^{-1}(U_2)$, then $x$ can be written as the sum of an element $y \in f^{-1}(U_1)$ and an element $z \in f^{-1}(U_2)$. Then $f(x) = f(y+z) = f(y)+f(z) \in U_1+U_2$, hence $x \in f^{-1}(U_1+U_2)$.
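If it helps to see the inclusion on a concrete instance, here is a small numerical sanity check. Everything in it is an assumption for illustration, not part of the question: I take $f\colon\mathbb{R}^3\to\mathbb{R}^3$ given by an invertible matrix $A$, with $U_1=\operatorname{span}\{e_1\}$ and $U_2=\operatorname{span}\{e_2\}$, so $U_1+U_2$ is the $xy$-plane; since $A$ is invertible, $f^{-1}(U_1)=\{tA^{-1}e_1\}$ and $f^{-1}(U_2)=\{sA^{-1}e_2\}$, and random samples of $y+z$ range over all of $f^{-1}(U_1)+f^{-1}(U_2)$.

```python
import numpy as np

# Hypothetical concrete instance (not from the question):
# f(x) = A x with A invertible, U1 = span{e1}, U2 = span{e2}.
A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
Ainv = np.linalg.inv(A)

e1, e2 = np.eye(3)[0], np.eye(3)[1]
U12 = np.column_stack([e1, e2])          # basis of U1 + U2 (the xy-plane)

def in_span(B, w, tol=1e-9):
    # w lies in the column span of B iff the least-squares residual vanishes
    c, *_ = np.linalg.lstsq(B, w, rcond=None)
    return np.linalg.norm(B @ c - w) < tol

rng = np.random.default_rng(0)
for _ in range(200):
    y = Ainv @ (rng.normal() * e1)       # y in f^{-1}(U1): f(y) is a multiple of e1
    z = Ainv @ (rng.normal() * e2)       # z in f^{-1}(U2): f(z) is a multiple of e2
    x = y + z                            # generic element of f^{-1}(U1) + f^{-1}(U2)
    assert in_span(U12, A @ x)           # f(x) = f(y) + f(z) lies in U1 + U2
```

Of course this only checks finitely many samples in one example; the proof above is what actually establishes the inclusion for every $f$, $U_1$, $U_2$.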