Corresponding matrix field basis



Hi people, I'm reviewing my notes for an exam, and this is a question I was unable to wrap my head around for many months. It should be fairly simple, but I might be missing a crucial piece of information. Where does $$\begin{bmatrix}0 & 0 \\\frac{-1}{r} & 1 \end{bmatrix}\begin{bmatrix}0 \\-1 \end{bmatrix}=\begin{bmatrix}0 \\-1 \end{bmatrix}$$ come from?
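(The product itself does hold for any nonzero $r$, since the vector's first entry is $0$; a quick numerical check, with $r = 3$ as an arbitrary sample value, confirms it. The question is why this identity is written down in the first place.)

```python
# Quick check of the matrix-vector product, with r = 3.0 as a sample value.
r = 3.0
A = [[0.0, 0.0],
     [-1.0 / r, 1.0]]
u = [0.0, -1.0]

# Matrix-vector product: (A u)_i = sum_j A[i][j] * u[j]
Au = [sum(A[i][j] * u[j] for j in range(2)) for i in range(2)]
print(Au)  # [0.0, -1.0] -- equal to u, since the first column of A is killed by the 0
```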

Best answer

It comes about from a change of basis for $\Bbb R^2$.

Instead of using the standard basis vectors $\hat{\mathbf i} = (1,0)$ and $\hat{\mathbf j} = (0,1)$, it uses the basis defined by $\hat{\mathbf e}_1 = (x,y)$ and $\hat{\mathbf e}_2 = (-y,x)$. Calling the vector from the original equation $\hat u = (y, -x)$, we see that $\hat u = -\hat{\mathbf e}_2 = 0\hat{\mathbf e}_1 + (-1)\hat{\mathbf e}_2$. So in terms of this basis (denoted by the $B$ subscript on the matrices): $$\hat u = \begin{bmatrix} 0 \\ -1\end{bmatrix}_B$$
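The identity $\hat u = -\hat{\mathbf e}_2$ can be checked numerically; here is a minimal sketch, with $(x, y) = (2, 5)$ as an arbitrary sample point:

```python
# Sample point (x, y) -- the values are arbitrary, chosen only for illustration.
x, y = 2.0, 5.0
e1 = (x, y)        # first basis vector
e2 = (-y, x)       # second basis vector
u  = (y, -x)       # the vector from the original equation

# u expressed in the new basis: 0*e1 + (-1)*e2
combo = (0 * e1[0] + (-1) * e2[0], 0 * e1[1] + (-1) * e2[1])
print(combo == u)  # True
```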

In general, there is a "change of coordinates" matrix, call it $M$, that converts coordinates in the new basis back to coordinates in the original. For example, $$\begin{align}a\hat{\mathbf e}_1 + b\hat{\mathbf e}_2&=a(x\hat{\mathbf i} + y\hat{\mathbf j}) + b(-y\hat{\mathbf i} + x\hat{\mathbf j})\\&=(ax - by)\hat{\mathbf i} + (ay + bx)\hat{\mathbf j}\end{align}$$ We can express this in matrix form: $$\begin{bmatrix}ax - by\\ay + bx\end{bmatrix}_C = \begin{bmatrix} x & -y\\y & x\end{bmatrix}\begin{bmatrix}a\\b\end{bmatrix}_B$$ where I am using $C$ to indicate coordinates with respect to the canonical basis $\{\hat{\mathbf i},\hat{\mathbf j}\}$. So $$M=\begin{bmatrix} x & -y\\y & x\end{bmatrix}$$ To convert in the other direction, you invert: $V_B = M^{-1}V_C$, or: $$\begin{bmatrix}{cx + dy\over x^2 + y ^2}\\{-cy + dx\over x^2 + y ^2}\end{bmatrix}_B = {1\over x^2 + y ^2}\begin{bmatrix} x & y\\-y & x\end{bmatrix}\begin{bmatrix}c\\d\end{bmatrix}_C$$
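A small Python sketch (again with the arbitrary sample point $(x, y) = (2, 5)$) checks that $M^{-1}$ as written really does invert $M$:

```python
# Check that M^{-1} = (1/(x^2+y^2)) [[x, y], [-y, x]] inverts M = [[x, -y], [y, x]].
x, y = 2.0, 5.0
d = x * x + y * y                      # det M = x^2 + y^2

M    = [[x, -y], [y,  x]]
Minv = [[ x / d, y / d],
        [-y / d, x / d]]

def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = matmul(Minv, M)
print(I)  # the 2x2 identity matrix, up to floating-point rounding
```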

Now your original expression is $A_C\hat u_C = \hat u_C$, expressed in $C$ coordinates, since that is the basis in which we know $A$. If we want to express it in $B$ coordinates, we can use $\hat u_C = M\hat u_B$ to rewrite it as $A_CM\hat u_B = M\hat u_B$, or $M^{-1}A_CM\hat u_B =\hat u_B$. So we see that $A_B = M^{-1}A_CM$.
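The similarity transform $A_B = M^{-1}A_CM$ can be sketched in code as well. The matrix `A_C` below is a made-up sample (the actual $A$ in the problem came from the notes, which aren't reproduced here); the point is only that acting with $A_B$ in $B$-coordinates agrees with converting to $C$-coordinates, acting with $A_C$, and converting back:

```python
# Illustration of A_B = M^{-1} A_C M with a hypothetical sample matrix A_C
# and an arbitrary sample point (x, y) = (2, 5). The identity holds for any choice.
x, y = 2.0, 5.0
d = x * x + y * y

M    = [[x, -y], [y, x]]
Minv = [[x / d, y / d], [-y / d, x / d]]
A_C  = [[1.0, 2.0], [3.0, 4.0]]        # hypothetical, for illustration only

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]

A_B = matmul(Minv, matmul(A_C, M))

# A_B applied to B-coordinates == (convert to C, apply A_C, convert back to B)
v_B = [0.0, -1.0]
lhs = matvec(A_B, v_B)
rhs = matvec(Minv, matvec(A_C, matvec(M, v_B)))
print(all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs)))  # True
```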

So presumably, $A_B = \begin{bmatrix} 0 & 0\\ 1\over r & 1\end{bmatrix}$. But this is where I differ. I calculate $A_B = \begin{bmatrix} 0 & 0\\ 1 & 1\end{bmatrix}$.