Finding x in matrix algebra


The question asked me to calculate $|A|$ for the given matrix below and state whether or not it is invertible.

$$ \left( \begin{matrix} 1 & 0 & 2 & 1\\ 0 & x & x & x\\ x & x & 3x & 1\\ 0 & x & x & 0\\ \end{matrix} \right) $$

I got my answer to be $-2x^3 + 2x^3 = 0$, indicating that the matrix is not invertible, since the determinant is zero regardless of the value of $x$.

However, I'm not certain whether I'm right after doing all the calculations. Could you help me verify?
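One way to double-check without redoing the algebra is to evaluate the determinant at several sample values of $x$: a nonzero determinant would show up at almost every sample point. Here is a minimal sketch in plain Python (names like `det` and `A` are my own, not from the question), using exact `Fraction` arithmetic and cofactor expansion:

```python
from fractions import Fraction

def det(m):
    """Determinant via Laplace expansion along the first row (exact arithmetic)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def A(x):
    """The matrix from the question, as a function of x."""
    return [[1, 0, 2, 1],
            [0, x, x, x],
            [x, x, 3 * x, 1],
            [0, x, x, 0]]

# The determinant vanishes at every sample point, consistent with det A = 0.
for x in (Fraction(-3), Fraction(0), Fraction(2), Fraction(1, 2)):
    assert det(A(x)) == 0
print("det A(x) = 0 for all sampled x")
```

Sampling cannot replace the symbolic computation, but since $\det A(x)$ is a polynomial in $x$, vanishing at more points than its degree already forces it to be identically zero.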


There are 3 answers below.

Accepted answer:

Okay, here's every step, expanding the determinant along the first row (the entry in column $2$ is $0$, so only three cofactors survive):

$$\left|\begin{matrix} x & x & x \\ x & 3x & 1 \\ x & x & 0\end{matrix}\right|+2\left|\begin{matrix} 0 & x & x \\ x & x & 1 \\ 0 & x & 0\end{matrix}\right| - \left|\begin{matrix} 0 & x & x \\ x & x & 3x \\ 0 & x & x\end{matrix}\right| \\ = x^2\left|\begin{matrix} 1 & 1 & x \\ 1 & 3 & 1 \\ 1 & 1 & 0\end{matrix}\right|+2x^2\left|\begin{matrix} 0 & 1 & x \\ 1 & 1 & 1 \\ 0 & 1 & 0\end{matrix}\right| - x^3\left|\begin{matrix} 0 & 1 & 1 \\ 1 & 1 & 3 \\ 0 & 1 & 1\end{matrix}\right| \\ = x^2\left(\left|\begin{matrix} 3 & 1 \\ 1 & 0\end{matrix}\right| - \left|\begin{matrix} 1 & 1 \\ 1 & 0\end{matrix}\right| + x\left|\begin{matrix} 1 & 3 \\ 1 & 1\end{matrix}\right|\right) + 2x^2\left(- \left|\begin{matrix} 1 & 1 \\ 0 & 0\end{matrix}\right| + x\left|\begin{matrix} 1 & 1 \\ 0 & 1\end{matrix}\right|\right) -x^3\left(- \left|\begin{matrix} 1 & 3 \\ 0 & 1\end{matrix}\right| + \left|\begin{matrix} 1 & 1 \\ 0 & 1\end{matrix}\right|\right) \\ = -2x^3+2x^3-0 \\ =0$$

So you did it correctly.
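For anyone who wants an independent cross-check of the three $3\times 3$ minors in the first line above, here is a small sketch (not part of the original answer) that evaluates them at a few sample values of $x$ and confirms the contributions $-2x^3$, $+2x^3$, and $0$:

```python
def det3(m):
    # 3x3 determinant by first-row cofactor expansion
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

for x in (-3, -1, 2, 5):
    M11 = det3([[x, x, x], [x, 3 * x, 1], [x, x, 0]])   # first minor
    M13 = det3([[0, x, x], [x, x, 1], [0, x, 0]])       # second minor
    M14 = det3([[0, x, x], [x, x, 3 * x], [0, x, x]])   # third minor
    assert M11 == -2 * x ** 3          # contributes -2x^3
    assert 2 * M13 == 2 * x ** 3       # contributes +2x^3
    assert M14 == 0                    # two equal rows, so it vanishes
    assert M11 + 2 * M13 - M14 == 0    # the whole expansion is 0
```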


Or, of course, for the easier way: just recognize that twice the first column plus the second column equals the third column of your matrix, so the determinant must be $0$.
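That column relation is easy to verify directly. A quick sketch (my own helper `columns`, not from the answer) checking $2\cdot\text{col}_1 + \text{col}_2 = \text{col}_3$ entry by entry at a few sample values of $x$:

```python
# Check the column relation 2*col1 + col2 == col3 at sample values of x.
def columns(x):
    A = [[1, 0, 2, 1],
         [0, x, x, x],
         [x, x, 3 * x, 1],
         [0, x, x, 0]]
    return list(zip(*A))  # transpose: a tuple per column

for x in (-2, 0, 1, 5):
    c1, c2, c3, _c4 = columns(x)
    assert tuple(2 * a + b for a, b in zip(c1, c2)) == c3
print("third column = 2*(first column) + (second column) for all sampled x")
```

Since the relation holds symbolically ($2\cdot 1 + 0 = 2$, $2\cdot 0 + x = x$, $2x + x = 3x$, $2\cdot 0 + x = x$), the columns are linearly dependent for every $x$.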

Second answer:

Recall that performing row replacement operations does not affect the determinant. Hence, if we replace row $3$ with (row $3$ + $(-x) \cdot $ row $1$), then replace row $3$ with (row $3$ - row $2$) and replace row $4$ with (row $4$ - row $2$), we obtain: $$ \begin{vmatrix} 1 & 0 & 2 & 1\\ 0 & x & x & x\\ x & x & 3x & 1\\ 0 & x & x & 0\\ \end{vmatrix} = \begin{vmatrix} 1 & 0 & 2 & 1\\ 0 & x & x & x\\ 0 & x & x & 1-x\\ 0 & x & x & 0\\ \end{vmatrix} = \begin{vmatrix} 1 & 0 & 2 & 1\\ 0 & x & x & x\\ 0 & 0 & 0 & 1-2x\\ 0 & 0 & 0 & -x\\ \end{vmatrix} $$ Since this last matrix is upper triangular, its determinant is the product of the diagonal entries, and one of those entries is zero, so $\det A = 0$, as you computed.
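The three row replacements above can be replayed mechanically. A minimal sketch (helper name `row_reduced` is mine) that applies them with exact arithmetic and then confirms the result is upper triangular with a zero on the diagonal:

```python
from fractions import Fraction

def row_reduced(x):
    """Apply the three determinant-preserving row replacements from above."""
    r1 = [Fraction(1), Fraction(0), Fraction(2), Fraction(1)]
    r2 = [Fraction(0), x, x, x]
    r3 = [x, x, 3 * x, Fraction(1)]
    r4 = [Fraction(0), x, x, Fraction(0)]
    r3 = [a - x * b for a, b in zip(r3, r1)]  # R3 <- R3 - x*R1
    r3 = [a - b for a, b in zip(r3, r2)]      # R3 <- R3 - R2
    r4 = [a - b for a, b in zip(r4, r2)]      # R4 <- R4 - R2
    return [r1, r2, r3, r4]

for x in (Fraction(-1), Fraction(0), Fraction(3), Fraction(1, 2)):
    U = row_reduced(x)
    # upper triangular: everything below the diagonal is zero
    assert all(U[i][j] == 0 for i in range(4) for j in range(i))
    # a zero sits on the diagonal, so the product of diagonal entries is 0
    assert U[2][2] == 0
```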

Third answer:

$A$ has zero determinant.

If you row reduce using the first-row, first-column pivot, then rows $2$ through $4$ of the 2nd and 3rd columns of the new matrix become $$ \begin{array}{cc} x&x \\ x&x \\x&x \end{array} $$ If $x = 0$, the second row is entirely zero and the determinant is trivially $0$. Otherwise, divide the second row by $x$ (this scales the determinant by $1/x$, which does not change whether it is zero) and row reduce on the 2nd-row, 2nd-column pivot to get the matrix $$\left( \begin{array}{cccc} 1&0&2&1 \\ 0&1&1&1 \\ 0&0&0&1-2x \\ 0&0&0&-x \end{array} \right) $$ The last two rows of this matrix are clearly multiples of one another, hence linearly dependent, so the determinant is zero.
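The $x \neq 0$ branch of this reduction can be sketched as follows (helper name `reduce_with_pivot` is mine, not from the answer); it reproduces the matrix above and checks that the last two rows are $(0,0,0,1-2x)$ and $(0,0,0,-x)$:

```python
from fractions import Fraction

def reduce_with_pivot(x):
    """Reduction described above; dividing row 2 by x assumes x != 0."""
    x = Fraction(x)
    assert x != 0
    r1 = [Fraction(1), Fraction(0), Fraction(2), Fraction(1)]
    r2 = [Fraction(0), x, x, x]
    r3 = [x, x, 3 * x, Fraction(1)]
    r4 = [Fraction(0), x, x, Fraction(0)]
    r3 = [a - x * b for a, b in zip(r3, r1)]  # clear the (3,1) entry
    r2 = [a / x for a in r2]                  # scale row 2 (needs x != 0)
    r3 = [a - x * b for a, b in zip(r3, r2)]  # clear the (3,2) entry
    r4 = [a - x * b for a, b in zip(r4, r2)]  # clear the (4,2) entry
    return [r1, r2, r3, r4]

for x in (-2, 1, 3, Fraction(1, 2)):
    M = reduce_with_pivot(x)
    # last two rows are (0,0,0,1-2x) and (0,0,0,-x): linearly dependent
    assert M[2][:3] == M[3][:3] == [0, 0, 0]
    assert M[2][3] == 1 - 2 * Fraction(x) and M[3][3] == -Fraction(x)
```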