Generalizing the entries of a (3x3) symmetric matrix and calculating the projection onto its range


I am still new to linear algebra and am having trouble understanding the concept of "rank-1" such that I can be certain that I am properly applying it in my attempt to solve the following problem. I would be very grateful if someone could clarify a) which of the two solution approaches below is correct and b) which false assumptions I made to arrive at thinking that the incorrect approach might be correct.

The assignment prompt reads as follows:

Suppose $a,b,c \in \mathbb{R} \setminus \left\{0\right\}$ and I tell you that $A$ is a $3\times3$ symmetric rank-1 matrix and that the first row of $A$ is $\begin{pmatrix} a & b & c\end{pmatrix}$. Fill in the rest of the entries for $A$. Then calculate the projection onto the range of $A$.

I understand "rank-1" to mean that a maximum of one column vector in $A$ is linearly independent. The others are thus linearly dependent. I begin both solution approaches with the fact that $A$ must be symmetric, writing:

$A=\begin{pmatrix}a & b & c \\ b & \gamma_1 & \gamma_2 \\ c & \gamma_2 & \gamma_3 \end{pmatrix}$

where $\gamma_1,\gamma_2,\gamma_3$ are to be further specified using the constraint that $A$ is rank-1.

Here is where my confusion about the implications of "rank-1" leads to diverging solution approaches.

APPROACH (A)

Since at least one column vector must be linearly independent, I choose the first column $\begin{pmatrix} a \\ b \\ c \end{pmatrix}$ to be linearly independent of the other two and the other two columns to be linearly dependent. This leads me to three different equations that allow me to define my $\gamma$'s in terms of $a,b,c$.

Because I assume the first column to be linearly independent of the other two:

(1) $\begin{pmatrix} a \\ b \\ c \end{pmatrix} \cdot \begin{pmatrix} b \\ \gamma_1 \\ \gamma_2 \end{pmatrix} = ab+\gamma_1 b + \gamma_2c \overset{!}{=} 0$

(2) $\begin{pmatrix} a \\ b \\ c \end{pmatrix} \cdot \begin{pmatrix} c \\ \gamma_2 \\ \gamma_3 \end{pmatrix} = ac + \gamma_2 b + \gamma_3 c \overset{!}{=} 0$

...and because I assume the second and third columns to be linearly dependent:

(3) $\begin{pmatrix} b \\ \gamma_1 \\ \gamma_2 \end{pmatrix} = \beta \begin{pmatrix} c \\ \gamma_2 \\ \gamma_3 \end{pmatrix}$

where $\beta \in \mathbb{R}$.

Starting with (3), I then get

(i) $b = \beta c $

(ii) $\gamma_1 = \beta \gamma_2 $

(iii) $\gamma_2 = \beta \gamma_3 $

From (i), I get $\beta = \frac{b}{c}$ and thus

(ii) $\gamma_1 = \frac{b}{c} \gamma_2 = \frac{b^2}{c^2} \gamma_3$

(iii) $\gamma_2 = \frac{b}{c} \gamma_3$

Plugging these into equation (1), I then get

(1) $ab + \frac{b^3}{c^2}\gamma_3 + b\gamma_3 = 0$

and work things out for $\gamma_3$.

Dividing through by $b$:

(1) $a + \frac{b^2}{c^2}\gamma_3 + \gamma_3 =0$

Factoring:

(1) $a + \gamma_3 \left ( \frac{b^2+c^2}{c^2}\right ) = 0$

And thus

$\gamma_3 = - \frac{ac^2}{b^2+c^2}$

This then gives

$\gamma_1 = - \frac{ab^2}{b^2+c^2}$

and

$\gamma_2 = - \frac{abc}{b^2+c^2}$

The matrix $A$ is then generalized as

$A = \begin{pmatrix} a & b & c \\ b & -\frac{ab^2}{b^2+c^2} & - \frac{abc}{b^2+c^2} \\ c & - \frac{abc}{b^2+c^2} & -\frac{ac^2}{b^2+c^2}\end{pmatrix}$

This still fulfills the symmetry condition.

I then find the projection onto the range of $A$ to be given by $P = \begin{pmatrix} 1 & 0 & 0\\ 0& 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$.

APPROACH (B)

I am worried that my understanding of rank-1 above is incorrect and that rank-1 actually means that one of the three columns acts as a common "base vector" for the other two, such that all three column vectors lie along the same line. In that case, I would set things up quite differently, e.g., once again choosing the first column vector as my starting point:

(1) $\begin{pmatrix} a \\ b \\ c \end{pmatrix} = \beta \begin{pmatrix} b \\ \gamma_1 \\ \gamma_2\end{pmatrix} $

(2) $\begin{pmatrix} a \\ b \\ c \end{pmatrix} = \alpha \begin{pmatrix} c \\ \gamma_2 \\ \gamma_3\end{pmatrix}$

From that approach's system of equations, I then get $\beta = \frac{a}{b}$, $\alpha = \frac{a}{c}$ and then

$\gamma_1 = \frac{b^2}{a}$

$\gamma_2 = \frac{bc}{a}$

$\gamma_3 = \frac{c^2}{a}$

And thus the resulting matrix is

$\begin{pmatrix} a & b & c \\ b & \frac{b^2}{a} & \frac{bc}{a} \\ c & \frac{cb}{a} & \frac{c^2}{a}\end{pmatrix}$
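As a quick numerical sanity check (a numpy sketch, not part of the assignment; the values of $a,b,c$ below are arbitrary nonzero choices), this matrix does indeed come out rank-1:

```python
import numpy as np

a, b, c = 2.0, 3.0, 5.0  # arbitrary nonzero sample values

# The matrix from approach (B): every column is a multiple of (a, b, c)
A = np.array([
    [a,        b,        c],
    [b, b**2 / a, b*c / a],
    [c, c*b / a, c**2 / a],
])

print(np.linalg.matrix_rank(A))  # prints 1
```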

The projection onto the range of $A$ is here once again the identity matrix.

This outcome "feels" more correct because the constraint that $a,b,c$ may not be 0 simply jumps out more here (no dividing by zero allowed!).

But such "gut feelings" are obviously not how I want to decide on the correct approach to the problem. My understanding of the implications of rank-1 should be solid enough that I know which of these is the right way to go.

If approach (B) is in fact correct, then how can we claim that rank-1 means a maximum of one linearly independent column when in this case all three columns can actually be written as scalar multiples of one another?

Thank you for taking the time to read this.

  • EDIT: Corrected for $\gamma_2 = \frac{bc}{a}$ in resulting matrix for $A$ (approach B).


BEST ANSWER

When a set of vectors has rank $k$, it means that there are $k$ independent vectors in the set, and the rest of the vectors are linear combinations of those $k$ vectors (they depend on the $k$ vectors). The $k$ independent vectors contribute all the dimensions by themselves, and the rest contribute nothing. (There may be multiple ways to choose the independent vectors; if they are nonzero multiples of each other, choosing any one will do.) It is not enough for the remaining vectors to be dependent among themselves; try plugging $a = b = c = 1$ into your matrix from approach (A), and you'll see the other two columns are independent of the first, so they contribute extra dimensions and enlarge the range.
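That counterexample is easy to check numerically. A minimal numpy sketch, using approach (A)'s matrix evaluated at $a = b = c = 1$ (which gives $\gamma_1 = \gamma_2 = \gamma_3 = -\tfrac{1}{2}$):

```python
import numpy as np

# Approach (A)'s matrix with a = b = c = 1,
# so gamma_1 = gamma_2 = gamma_3 = -1/2
A = np.array([
    [1.0,  1.0,  1.0],
    [1.0, -0.5, -0.5],
    [1.0, -0.5, -0.5],
])

print(np.linalg.matrix_rank(A))  # prints 2 -- not rank-1
```

Columns 2 and 3 are multiples of each other (in fact equal here), but neither is a multiple of column 1, so the rank is 2.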

So when the columns have rank $1$, it means that there is one vector that is independent by itself (it is non-zero) and the rest depend on it (they are multiples of it). This is your approach B, which is mostly correct.

The condition you use in approach (A) is not correct. Rank $k$ means that the remaining vectors are dependent on the $k$ independent vectors (they lie in the span of those $k$ vectors), not merely that they are dependent among themselves.

I'm not exactly sure how you computed the projection onto the range of the matrix, but it's not right: the projection should also have rank $1$, because any two matrices with the same range have the same rank (the rank is the dimension of the range). The identity has rank $3$, so it cannot be the projection onto the range of a rank-1 matrix.
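For completeness, here is a numpy sketch of the projection. Since the range of the rank-1 symmetric $A$ is the line spanned by $v = (a, b, c)^T$, the standard orthogonal projection onto that line is $P = \frac{vv^T}{v^Tv}$ (the sample values of $a,b,c$ below are arbitrary nonzero choices):

```python
import numpy as np

a, b, c = 2.0, 3.0, 5.0          # arbitrary nonzero sample values
v = np.array([[a], [b], [c]])    # column vector spanning the range of A

# Orthogonal projection onto the line spanned by v: P = v v^T / (v^T v)
P = (v @ v.T) / float(v.T @ v)

assert np.allclose(P @ P, P)     # idempotent: P^2 = P
assert np.allclose(P, P.T)       # symmetric: an orthogonal projection
print(np.linalg.matrix_rank(P))  # prints 1, matching the rank of A
```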