Vectors are a special representation of matrices


Going through vectors, I noted that a vector is nothing but a special representation of a matrix.

Let $\overrightarrow{A}=a_{1}\hat{i}+a_{2}\hat{j}+a_{3}\hat{k}=\begin{pmatrix}a_{1}\\ a_{2}\\ a_{3} \end{pmatrix}$ and $\overrightarrow{B}=b_{1}\hat{i}+b_{2}\hat{j}+b_{3}\hat{k}=\begin{pmatrix}b_{1}\\ b_{2}\\ b_{3} \end{pmatrix}$.

Now we can perform every vector operation in the form of matrices, including vector addition, scalar multiplication, the dot product, and so on. For example:

$\overrightarrow{A}\cdot\overrightarrow{B}=A^{T}B=\begin{pmatrix}a_{1} & a_{2} & a_{3}\end{pmatrix}\begin{pmatrix}b_{1}\\ b_{2}\\ b_{3} \end{pmatrix}=\begin{pmatrix}a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\end{pmatrix}$
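As a quick numerical sanity check of this identity (a minimal sketch using NumPy, with arbitrary example values):

```python
import numpy as np

# Treat 3-vectors as 3x1 column matrices.
A = np.array([[1.0], [2.0], [3.0]])
B = np.array([[4.0], [5.0], [6.0]])

# The dot product as a matrix product: A^T B is a 1x1 matrix.
dot_as_matrix = A.T @ B          # shape (1, 1)
print(dot_as_matrix[0, 0])       # 1*4 + 2*5 + 3*6 = 32
```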

One thing I cannot see is how to do the cross product of vectors in the form of matrices. Note: don't confuse this with determinants; my question is only about matrices.


There are 2 answers below.

Accepted answer:

On how to calculate the cross product:

We start with the outer product of the two vectors (not to be confused with the scalar product $a^Tb$): $$M=ab^T = \begin{pmatrix} a_1 b_1 & a_1 b_2 & a_1 b_3\\ a_2 b_1 & a_2 b_2 & a_2 b_3\\ a_3 b_1 & a_3 b_2 & a_3 b_3 \end{pmatrix}$$ Now we simply subtract the transposed matrix, $M^T=ba^T$, to get $$B=ab^T-ba^T = \begin{pmatrix} 0 & a_1 b_2 - a_2 b_1 & a_1 b_3 - a_3 b_1\\ a_2 b_1 - a_1 b_2 & 0 & a_2 b_3 - a_3 b_2\\ a_3 b_1 - a_1 b_3 & a_3 b_2 - a_2 b_3 & 0 \end{pmatrix}$$ Here you already see the coefficients of the cross product, but they are scattered throughout the matrix, so we need to extract them.

For this, we need the following three matrices (which together form a representation of the so-called Levi-Civita tensor; more on this below): $$\epsilon_1 = \begin{pmatrix} 0 & 0 & 0\\ 0 & 0 & 1\\ 0 & -1 & 0 \end{pmatrix}, \epsilon_2 = \begin{pmatrix} 0 & 0 & -1\\ 0 & 0 & 0\\ 1 & 0 & 0 \end{pmatrix}, \epsilon_3 = \begin{pmatrix} 0 & 1 & 0\\ -1 & 0 & 0\\ 0 & 0 & 0 \end{pmatrix}$$ Now let's see what happens if we calculate, e.g., $\epsilon_1 B$: $$\epsilon_1 B = \begin{pmatrix} 0 & 0 & 0\\ a_3 b_1 - a_1 b_3 & a_3 b_2 - a_2 b_3 & 0\\ a_1 b_2 - a_2 b_1 & 0 & a_3 b_2 - a_2 b_3 \end{pmatrix}$$ As you see, on the diagonal only the terms for component $1$ of the cross product appear, although with the wrong sign (but we can easily fix that). So how do we extract them? Well, there's a standard function on matrices called the trace, which is just the sum of the diagonal elements. Here we get $\operatorname{tr}(\epsilon_1 B) = 2(a_3 b_2 - a_2 b_3)$. Multiplying by $-\frac12$ then gives the first component of the cross product, $a_2 b_3 - a_3 b_2$. Now we just need to get it into the first component of the vector, but that's easy: using the standard basis $$e_1 = \begin{pmatrix}1\\0\\0\end{pmatrix}, e_2 = \begin{pmatrix}0\\1\\0\end{pmatrix}, e_3 = \begin{pmatrix}0\\0\\1\end{pmatrix}$$ we see that we just have to multiply the component by $e_1$.
It is easy to check that the same works for the other components, so putting everything together, we finally arrive at the following formula for the cross product: $$a\times b = \frac12\sum_{k=1}^3 \operatorname{tr}\left(\epsilon_k (ba^T- ab^T)\right) e_k\tag{1}$$
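The whole construction can be checked numerically. Below is a minimal NumPy sketch of the trace formula, with the three matrices transcribed from above (note the outer products $ba^T - ab^T$, so that the sign comes out right):

```python
import numpy as np

# The three Levi-Civita matrices from the answer.
eps = [
    np.array([[0, 0, 0], [0, 0, 1], [0, -1, 0]]),
    np.array([[0, 0, -1], [0, 0, 0], [1, 0, 0]]),
    np.array([[0, 1, 0], [-1, 0, 0], [0, 0, 0]]),
]

def cross_via_trace(a, b):
    """a x b = 1/2 sum_k tr(eps_k (b a^T - a b^T)) e_k."""
    M = np.outer(b, a) - np.outer(a, b)   # b a^T - a b^T
    return np.array([0.5 * np.trace(e @ M) for e in eps])

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
print(cross_via_trace(a, b))   # [-3.  6. -3.], same as np.cross(a, b)
```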

Now what are those mysterious matrices $\epsilon_i$? Well, let's look at the element in row $j$ and column $k$, $(\epsilon_i)_{jk}$, and simply remove the parentheses to define $\epsilon_{ijk}=(\epsilon_i)_{jk}$. Then you can easily check the following properties of $\epsilon_{ijk}$:

  • $\epsilon_{123} = 1$
  • $\epsilon_{ijk} = \epsilon_{jki} = \epsilon_{kij}$
  • $\epsilon_{ijk} = -\epsilon_{ikj}$ (seen from the fact that the matrices above are antisymmetric)
  • $\epsilon_{ijj} = 0$ (seen in the fact that all diagonal entries of the matrices are $0$)

Or in other words, $\epsilon_{ijk}$ changes sign whenever two indices are exchanged, and is zero whenever two indices are equal. The $\epsilon_{ijk}$ described above is known as the Levi-Civita tensor; I just packaged it into three matrices in the obvious way.
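A short sketch of this "packaging", building the tensor directly from the sign-of-permutation definition (NumPy, 0-based indices):

```python
import numpy as np

# Levi-Civita tensor from its defining properties: epsilon_{123} = 1,
# invariant under cyclic shifts, sign flip when two indices are swapped.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0    # even permutations of (0, 1, 2)
    eps[k, j, i] = -1.0   # swapping two indices flips the sign

# "Unpacking" epsilon_{ijk} = (eps_i)_{jk} recovers the three matrices
# from the answer; e.g. eps[0] is the matrix called epsilon_1 above.
assert (eps[0] == np.array([[0, 0, 0], [0, 0, 1], [0, -1, 0]])).all()

# The familiar index formula (a x b)_i = sum_{j,k} epsilon_{ijk} a_j b_k:
a, b = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
print(np.einsum('ijk,j,k->i', eps, a, b))   # same as np.cross(a, b)
```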

Note that $(1)$ can be simplified by using the facts that $\operatorname{tr}(A+B) = \operatorname{tr}(A) + \operatorname{tr}(B)$ (easily seen from the fact that when adding matrices you also add their diagonal elements) and that $\operatorname{tr}\left(A(uv^T)\right)=v^TAu$. Using this (and the fact that the $\epsilon_i$ are antisymmetric), we can rewrite $(1)$ as: $$a\times b = \sum_{k=1}^3 e_k (a^T\epsilon_k b)$$ Note that I've written $e_k$ at the beginning, as that way the expression remains valid if removing the parentheses; with $e_k$ at the end, the parentheses would be mandatory for the expression to be defined.
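The simplified formula is pleasantly short in code as well; each component of the cross product is just the bilinear form $a^T\epsilon_k b$ (again a NumPy sketch, with the $\epsilon_k$ copied from above):

```python
import numpy as np

eps = [
    np.array([[0, 0, 0], [0, 0, 1], [0, -1, 0]]),
    np.array([[0, 0, -1], [0, 0, 0], [1, 0, 0]]),
    np.array([[0, 1, 0], [-1, 0, 0], [0, 0, 0]]),
]

def cross_simplified(a, b):
    """a x b = sum_k e_k (a^T eps_k b)."""
    return np.array([a @ e @ b for e in eps])

a = np.array([2.0, 0.0, -1.0])
b = np.array([1.0, 3.0, 4.0])
print(cross_simplified(a, b))   # [ 3. -9.  6.], same as np.cross(a, b)
```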

Second answer:

Matrices are a special representation of vectors, not the other way around.

For example, you have given the vector $\vec{A} = a_1\hat{i} + a_2\hat{j} + a_3\hat{k}$. This is a representation of the vector $\vec{A}$ as a linear combination of the basis vectors $\hat{i}, \hat{j}$ and $\hat{k}$. When you choose to write the vector as a matrix, it is understood that the elements of the matrix are the coefficients of a linear combination of basis vectors. That is, \begin{equation} \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix} \end{equation} should be understood to mean the linear combination $a_1\hat{i} + a_2\hat{j} + a_3\hat{k}$. Thus, it doesn't make sense to represent a vector as a matrix without defining the basis vectors.
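A tiny numerical illustration of this point (NumPy; the standard basis and the coefficient values here are arbitrary choices for the example):

```python
import numpy as np

# A column of coefficients only acquires meaning once a basis is fixed.
i_hat = np.array([1.0, 0.0, 0.0])
j_hat = np.array([0.0, 1.0, 0.0])
k_hat = np.array([0.0, 0.0, 1.0])

coeffs = np.array([7.0, -2.0, 5.0])   # the column [a1, a2, a3]^T

# The vector it represents is the linear combination a1*i + a2*j + a3*k.
A = coeffs[0] * i_hat + coeffs[1] * j_hat + coeffs[2] * k_hat
print(A)   # [ 7. -2.  5.]

# With a different basis, the same column would represent a different vector.
```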

As for the dot product in matrix form, first consider the dot product of vectors: \begin{equation} \vec{A}\cdot\vec{B} = \sum_{i=1}^3 a_ib_i. \end{equation} So if we let \begin{equation} A = \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix},\ B = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}, \end{equation} we find that $A^\top B$ gives the correct result.