I am having some trouble with an exercise from "Finite Dimensional Vector Spaces" by Halmos. I am not sure whether my proof is correct, or whether my answer is what the author asked for. Thanks for any help!
Suppose that $m < n$ and that $y_{1}, \dots, y_{m}$ are linear functionals on an $n$-dimensional vector space $\mathbb{V}$. Under what conditions on the scalars $\alpha_{1}, \dots, \alpha_{m}$ is it true that there exists a vector $x$ in $\mathbb{V}$ such that $y_{j}(x) = \alpha_{j}$ for $j = 1, \dots, m$? What does this result say about the solutions of linear equations?
We assume without loss of generality that at least two of the functionals $y_{1}, \dots, y_{m}$ are linearly dependent. More precisely, there exist $k, l \leq m$ with $k \neq l$ and $c \in \mathbb{K}$ such that $y_{k}(x_{0}) = c\,y_{l}(x_{0})$. Now we can write $y_{k}(x_{0}) - c\,y_{l}(x_{0}) = \alpha_{k} - c\alpha_{l} = 0$. From this we conclude that such a vector $x$ in $\mathbb{V}$ exists under the condition that, whenever $y_{k}$ and $y_{l}$ are linearly dependent, there exists a $c \in \mathbb{K}$ with $\alpha_{k} = c\alpha_{l}$.
There is a loss of generality in your assumption. For example, the sum of all of the functionals may vanish even though they are pairwise independent.
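To make this counterexample concrete, here is a small numerical sketch (my own example, not from the exercise): three functionals on $\mathbb{R}^2$, encoded as row vectors, which are pairwise independent yet sum to zero, so a compatibility constraint involves all three at once and is invisible to any pairwise test.

```python
import numpy as np

# Three functionals on R^2, written as row vectors (my own example).
y1 = np.array([1.0, 0.0])
y2 = np.array([0.0, 1.0])
y3 = -(y1 + y2)          # y3 = -y1 - y2, so y1 + y2 + y3 = 0

A = np.vstack([y1, y2, y3])

# Pairwise, no two rows are scalar multiples of each other:
for i in range(3):
    for j in range(i + 1, 3):
        assert np.linalg.matrix_rank(A[[i, j]]) == 2

# Yet the three together are linearly dependent: rank is 2, not 3.
assert np.linalg.matrix_rank(A) == 2

# Hence y_j(x) = alpha_j is solvable only if alpha_1 + alpha_2 + alpha_3 = 0,
# a constraint that no pairwise comparison of the alpha_j can detect.
alpha = np.array([1.0, 2.0, 5.0])       # violates the constraint
aug = np.column_stack([A, alpha])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(aug))  # ranks differ: no solution
```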
You need to find the kernel of the map $\Lambda: c\in {\Bbb C}^m \mapsto \sum_j c_j y_j$. Clearly, if $c\in \ker \Lambda$, then any solution to $y_j(x)=\alpha_j$ must satisfy $$ 0=\sum_j c_j y_j(x)= \sum_j c_j \alpha_j, $$ so $\alpha$ must be orthogonal to $\ker \Lambda$ (with respect to the bilinear pairing $\langle c,\alpha\rangle = \sum_j c_j\alpha_j$). This condition also happens to be sufficient: if $\Lambda^* : x\in V \mapsto (y_1(x),\ldots, y_m(x)) \in {\Bbb C}^m$ denotes the dual map to $\Lambda$, then a theorem of finite-dimensional linear algebra states that $$ \operatorname{Im}\, \Lambda^* = (\ker \Lambda)^\perp, $$ and this translates into the above-mentioned condition.
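The criterion can be sketched numerically (my own encoding, not part of the answer): write the functionals $y_1,\dots,y_m$ as the rows of an $m\times n$ matrix $A$, so that $\Lambda(c) = \sum_j c_j y_j$ corresponds to $A^{\mathsf T}c$ and the problem $y_j(x)=\alpha_j$ becomes the linear system $Ax=\alpha$. The hypothetical helper `solvable` below checks the orthogonality condition $\alpha \perp \ker(A^{\mathsf T})$.

```python
import numpy as np

def solvable(A, alpha, tol=1e-10):
    """alpha admits a solution of A x = alpha iff alpha is orthogonal to
    ker(Lambda) = ker(A^T), i.e. iff alpha lies in the image of A."""
    # A basis of ker(A^T) from the SVD of A^T: the rows of Vt past the rank.
    _, s, Vt = np.linalg.svd(A.T)
    rank = int(np.sum(s > tol))
    kernel = Vt[rank:]                   # rows span ker(A^T)
    return bool(np.all(np.abs(kernel @ alpha) < tol))

# Example (mine): row 3 = row 1 + row 2, so (1, 1, -1) spans ker(A^T)
# and the compatibility condition reads alpha_1 + alpha_2 - alpha_3 = 0.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 3.0]])

assert solvable(A, np.array([1.0, 2.0, 3.0]))      # 1 + 2 - 3 = 0: solvable
assert not solvable(A, np.array([1.0, 2.0, 4.0]))  # 1 + 2 - 4 != 0: no solution
```

This is exactly the familiar statement about linear equations that the exercise is after: $Ax=\alpha$ is solvable precisely when every linear dependence among the rows of $A$ is matched by the same dependence among the entries of $\alpha$.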