Solving system of linear equations using orthogonal matrix


I'm given the following matrix:

$$ A = \begin{pmatrix} \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ 0 & 1 & 0 \\ \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{pmatrix} $$

We're asked to determine whether this matrix is orthogonal. I did this successfully using the rule that a matrix $A$ is orthogonal iff $A^T = A^{-1}$. The second part of the question, however, I find difficult to answer, since I have no idea how to start. The question is as follows: how would you determine a solution of the system of linear equations $A\vec{x} = \vec{b}$ using the orthogonal matrix above?

I have determined that $A\vec{x} = \vec{b}$ translates to:

$$ \begin{pmatrix} \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ 0 & 1 & 0 \\ \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{pmatrix} \begin{pmatrix} x \\y \\z \end{pmatrix} = \begin{pmatrix} b_1 \\ b_2 \\ b_3 \end{pmatrix} $$ yet I have no idea how orthogonality could help me solve this system easily. Does anyone have an idea how to answer this type of question?

ACCEPTED ANSWER

The point of showing that your matrix is orthogonal is that you then know its inverse. A matrix is orthogonal once you know that its columns (equivalently, its rows) are pairwise perpendicular (dot product zero) and that each column/row is a unit vector. $$ \begin{aligned} v_1&= \begin{pmatrix} \dfrac{1}{\sqrt{2}} \\ 0 \\ \dfrac{1}{\sqrt{2}} \end{pmatrix} \\ v_2&= \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \\ v_3&= \begin{pmatrix} -\dfrac{1}{\sqrt{2}} \\ 0 \\ \dfrac{1}{\sqrt{2}} \end{pmatrix} \\ v_1 \cdot v_1&= \dfrac{1}{2} + 0 + \dfrac{1}{2}= 1 \\ v_2 \cdot v_2&= 0 + 1 + 0 = 1 \\ v_3 \cdot v_3&= \dfrac{1}{2} + 0 + \dfrac{1}{2} = 1 \\ v_1 \cdot v_2&= 0 \\ v_1 \cdot v_3&= 0 \\ v_2 \cdot v_3&= 0 \end{aligned} $$ This shows that $A$ is orthogonal. Then, since $A$ is an orthogonal matrix, we know $A^{-1}= A^T$. Knowing the inverse, we can solve the system $A\vec{x}=\vec{b}$ via $$ \begin{aligned} A\vec{x}&= \vec{b} \\ A^{-1}A\vec{x}&= A^{-1}\vec{b} \\ A^TA\vec{x}&= A^T\vec{b} \\ \vec{x}&= A^T\vec{b} \end{aligned} $$
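The derivation above is easy to check numerically. Here is a minimal sketch in Python/NumPy (the vector `b` is an arbitrary example I made up; any right-hand side works the same way):

```python
import numpy as np

# The given matrix.
s = 1 / np.sqrt(2)
A = np.array([[ s,  0.0, -s ],
              [0.0, 1.0, 0.0],
              [ s,  0.0,  s ]])

# Orthogonality check: A^T A should be the identity.
assert np.allclose(A.T @ A, np.eye(3))

# Solve A x = b via x = A^T b, with no matrix inversion needed.
b = np.array([1.0, 2.0, 3.0])  # arbitrary example right-hand side
x = A.T @ b

# Sanity checks: x really solves the system, and matches a generic solver.
assert np.allclose(A @ x, b)
assert np.allclose(x, np.linalg.solve(A, b))
print(x)
```

The practical payoff is that $\vec{x} = A^T\vec{b}$ is just a matrix-vector product, which is cheaper and numerically more stable than Gaussian elimination or computing an inverse.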

NOTE. I originally used the word unitary. A unitary matrix is just the complex-number analogue of an orthogonal matrix, so a real matrix is orthogonal if and only if it is unitary. I prefer the word unitary because it reminds me of what needs checking: the columns (or rows) are perpendicular to each other, and each row/column has unit length (length 1).