I need to express the Arrow securities $e_1 = (1,0,0)$, $e_2 = (0,1,0)$, and $e_3 = (0,0,1)$ as linear combinations of the securities $r_1, r_2, r_3, r_4$, represented as the columns of the following matrix:
$$ \begin{pmatrix} 1 & 2 & 2 & 6 \\ 1 & 0 & 1 & 3 \\ 1 & 0 & 1 & 1 \\ \end{pmatrix}$$
I tried to solve this by augmenting the identity matrix on the right and then reducing to row echelon form. I got the following result:
$$\left(\begin{array}{cccc|ccc} 1 & 0 & 1 & 0 & 0 & -1/2 & 3/2\\ 0 & 1 & 1/2 & 0 & 1/2 & -5/4 &3/4\\ 0 & 0 & 0 & 1 & 0 & 1/2 & -1/2\\ \end{array}\right)$$
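One can check this reduction numerically: since each row op acts as a left multiplication on both blocks of the augmented matrix, the right block applied to $M$ must reproduce the left block. A minimal sketch with numpy (the matrices below are copied from the result above):

```python
import numpy as np

M = np.array([[1, 2, 2, 6],
              [1, 0, 1, 3],
              [1, 0, 1, 1]], dtype=float)

# Left and right blocks of the reduced augmented matrix above
R = np.array([[1, 0, 1.0, 0],
              [0, 1, 0.5, 0],
              [0, 0, 0.0, 1]])
E = np.array([[0.0, -0.50, 1.50],
              [0.5, -1.25, 0.75],
              [0.0,  0.50, -0.50]])

# Row-reducing [M | I] applies the same row ops to both blocks,
# so the right block records them: E @ M must equal R.
assert np.allclose(E @ M, R)
```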
From the solution manual, I know that:
$$ \begin{aligned} e_1 &= 0.5 r_2 \\ e_2 &= 0.5r_4 - 0.5r_1 - 1.25r_2 \\ e_3 &= r_1 -e_1 - e_2 \end{aligned} $$
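These combinations are easy to verify numerically; a quick sanity check with numpy:

```python
import numpy as np

# Columns r_1..r_4 of the payoff matrix
M = np.array([[1, 2, 2, 6],
              [1, 0, 1, 3],
              [1, 0, 1, 1]], dtype=float)
r1, r2, r3, r4 = M.T

# The solution manual's combinations
e1 = 0.5 * r2
e2 = 0.5 * r4 - 0.5 * r1 - 1.25 * r2
e3 = r1 - e1 - e2

assert np.allclose(e1, [1, 0, 0])
assert np.allclose(e2, [0, 1, 0])
assert np.allclose(e3, [0, 0, 1])
```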
I can see some relation between this solution and my computations. But how can I exactly deduce this from my computations? What is the intuition behind it?
The first difficulty is that there are many possible solutions. Suppose (in a $2 \times 3$ case) that the columns were $(1, 0), (2, 0), (0, 1)$ and you had to express $(4, 0)$ as a combination of columns. Then "4 times column 1" is an answer, as is "2 times column 2". In general, linear dependence (which you get whenever you have more columns than dimensions) leads to this sort of "multiple correct answers" situation.
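This non-uniqueness can be checked directly; a minimal numpy sketch of the toy case:

```python
import numpy as np

# Toy 2x3 case: columns (1,0), (2,0), (0,1)
C = np.array([[1, 2, 0],
              [0, 0, 1]], dtype=float)
target = np.array([4.0, 0.0])

# Two different coefficient vectors hit the same target
assert np.allclose(C @ np.array([4.0, 0, 0]), target)  # 4 times column 1
assert np.allclose(C @ np.array([0, 2.0, 0]), target)  # 2 times column 2
```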
Your example is just like this, except that the redundancy isn't quite so obvious.
The second difficulty is more fundamental: when you adjoin the identity to the right of a square matrix $M$ and do row ops on both, each row op corresponds to LEFT multiplication by an elementary matrix. So after one op, you have $$ E_1 M \mid E_1 I $$ and after two, you have $$ E_2 E_1 M \mid E_2 E_1 I. $$ Notice that at any point, the left block is the right block times $M$. If you reduce all the way to the identity on the left, you have $$ I \mid E_k \cdots E_2 E_1 I $$ with $(E_k \cdots E_2 E_1) M = I$, so you've expressed every row of $I$ as a combination of the rows of $M$. When $M$ isn't square, you achieve almost the same thing. The key point is that this gets you rows of $I$ as combinations of rows of $M$. That won't help you solve your problem, which involves columns.
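To see a single row op as a left multiplication concretely, here is a small numpy sketch; the elementary matrix $E_1$ below encodes "subtract row 1 from row 2":

```python
import numpy as np

M = np.array([[1, 2, 2, 6],
              [1, 0, 1, 3],
              [1, 0, 1, 1]], dtype=float)

# Elementary matrix for "row 2 <- row 2 - row 1"
E1 = np.array([[1, 0, 0],
               [-1, 1, 0],
               [0, 0, 1]], dtype=float)

aug = np.hstack([M, np.eye(3)])
aug[1] -= aug[0]  # the same row op applied to the augmented matrix

# Left block equals E1 @ M; right block equals E1 @ I = E1
assert np.allclose(aug[:, :4], E1 @ M)
assert np.allclose(aug[:, 4:], E1)
```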
Solution: append a $4 \times 4$ identity BELOW $M$, like this: $$\begin{bmatrix} 1 & 2 & 2 & 6 \\ 1 & 0 & 1 & 3 \\ 1 & 0 & 1 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix}$$ and then do column ops. The result will be a $3 \times 4$ matrix $A$ at the top, and a $4 \times 4$ matrix $Q$ on the bottom, with $MQ = A$. If you've done your column reduction right, three of $A$'s columns will be the standard basis vectors [$(1,0,0), (0,1,0), (0, 0, 1)$], and the corresponding columns of $Q$ will show one way to express each of them as a linear combination of the columns of $M$. The remaining column of $A$ will be all zeros, and the corresponding column of $Q$ will show a way to express $(0,0,0)$ as a combination of columns of $M$. You can add or subtract this from your expression for $(1,0,0)$ as a combination of columns of $M$ to get a NEW way to express it as a combination of columns, showing that the solution to the problem is not unique.
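The whole procedure can be sketched in code. Column ops on the stacked matrix $\begin{bmatrix} M \\ I \end{bmatrix}$ are the same as row ops on $[M^T \mid I]$, so the sketch below (numpy assumed, with a plain hand-rolled Gauss–Jordan loop) reduces that transposed system and transposes back:

```python
import numpy as np

M = np.array([[1, 2, 2, 6],
              [1, 0, 1, 3],
              [1, 0, 1, 1]], dtype=float)

# Column ops on [M; I] are row ops on [M^T | I]: reduce that instead.
aug = np.hstack([M.T, np.eye(4)])

# Plain Gauss-Jordan elimination (no pivoting refinements; fine here).
row = 0
for col in range(3):                  # only eliminate within the M^T block
    nz = np.nonzero(np.abs(aug[row:, col]) > 1e-12)[0]
    if len(nz) == 0:
        continue
    p = row + nz[0]
    aug[[row, p]] = aug[[p, row]]     # swap pivot row up
    aug[row] /= aug[row, col]         # scale pivot entry to 1
    for r in range(aug.shape[0]):     # clear the column elsewhere
        if r != row:
            aug[r] -= aug[r, col] * aug[row]
    row += 1

A = aug[:, :3].T   # top block after the column ops: [e1 e2 e3 0]
Q = aug[:, 3:].T   # bottom block, recording the ops: M @ Q == A

assert np.allclose(M @ Q, A)
assert np.allclose(A[:, :3], np.eye(3)) and np.allclose(A[:, 3], 0)
```

Each nonzero column of $A$ is a standard basis vector, and the matching column of $Q$ gives one set of coefficients on $r_1, \ldots, r_4$; the zero column of $A$ pairs with a column of $Q$ that expresses $(0,0,0)$, which is exactly the redundancy discussed above.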