Math 'equal to?' symbol


My teacher used the following symbol: $\boxed{\overset{\wedge}{=}}$

We had to write down a vector equation, and he said my direction vector $\begin{pmatrix}6\\2\\2\end{pmatrix}$ could be simplified to $\begin{pmatrix}3\\1\\1\end{pmatrix}$ since length doesn't matter as a direction vector.

He wrote in my vector equation:

$$\begin{pmatrix}6\\2\\2\end{pmatrix} \overset{\wedge}{=} \begin{pmatrix}3\\1\\1\end{pmatrix}.$$

Is this notation correct, and if so, what is the name and when is it used?

3 Answers

BEST ANSWER

This notation is not standard. One can define an equivalence relation on $\mathbb{R}^{n}$ such that $v\sim u\iff u=\alpha v$ for some $0\neq\alpha\in\mathbb{R}$, and then write, for example, $$ \begin{pmatrix}6\\ 2\\ 2 \end{pmatrix}\sim\begin{pmatrix}3\\ 1\\ 1 \end{pmatrix} $$
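The relation above ("equal up to a nonzero scalar") is easy to check computationally. Here is a minimal sketch; the function name `is_equivalent` is mine, not standard:

```python
import math

def is_equivalent(u, v):
    """Check u ~ v, i.e. u = alpha * v for some nonzero scalar alpha.
    Assumes u and v are nonzero vectors of the same length."""
    # Determine the candidate alpha from a nonzero component of v.
    alpha = None
    for ui, vi in zip(u, v):
        if vi != 0:
            alpha = ui / vi
            break
    if alpha is None or alpha == 0:
        return False
    # Every component must scale by the same factor alpha.
    return all(math.isclose(ui, alpha * vi) for ui, vi in zip(u, v))

print(is_equivalent((6, 2, 2), (3, 1, 1)))  # True: alpha = 2
print(is_equivalent((6, 2, 2), (3, 1, 0)))  # False: no single alpha works
```

Note that the relation is symmetric even though the code picks alpha from one side, since $u=\alpha v$ with $\alpha\neq 0$ implies $v=\alpha^{-1}u$.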

Regarding the use of the hat symbol with vectors: it is standard that if $0\neq v\in\mathbb{R}^{n}$, then we denote $$ \hat{v}=\frac{v}{\|v\|} $$

In this case $\hat{v}$ spans the same one-dimensional subspace as $v$ and satisfies $\|\hat{v}\|=1$.
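A small sketch of this normalization, using only the standard library (the helper name `normalize` is mine). Note that the two direction vectors from the question produce the same unit vector, which is exactly the sense in which they are interchangeable:

```python
import math

def normalize(v):
    """Return v_hat = v / ||v||, the unit vector in the direction of v.
    Assumes v is a nonzero vector."""
    norm = math.sqrt(sum(x * x for x in v))
    return tuple(x / norm for x in v)

v_hat = normalize((6, 2, 2))
print(v_hat)
print(math.isclose(sum(x * x for x in v_hat), 1.0))  # True: ||v_hat|| = 1
```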


I don't think the particular one that you mention is valid. See here for a full list. He or she may have meant to write $ \cong $ or $ \equiv $; both of them make sense, since they both imply that the two vectors are equivalent and of the same form and meaning.


It is not conventional and will not be broadly understood without an explanation of what it means, but there's nothing incorrect about it if you explain it before using it.

Here's another example of its use: Suppose $H_1,\ldots,H_n$ are mutually exclusive hypotheses one of which must be true. Their probabilities given some new data $D$ are desired. Then $$ (P(H_1\mid D),\ldots,P(H_n\mid D)) \overset{\wedge}{=} (P(H_1),\ldots,P(H_n)) \cdot (P(D\mid H_1),\ldots,P(D\mid H_n)) $$ where the dot means term-by-term multiplication. After thus finding the equivalence class of the vector on the left, the constant by which all components must be multiplied to get the actual probabilities is the one that makes their sum equal to $1$.

And yet another example: A linear dependence among vectors $\vec{x}_1,\ldots,\vec{x}_m$ is an $m$-tuple of scalars $c_1,\ldots,c_m$, not all $0$, such that $c_1\vec{x}_1+ \cdots+c_m\vec{x}_m=\vec{0}$. But any nonzero scalar multiple of $(c_1,\ldots,c_m)$ works just as well, and expresses exactly the same nature of dependence among $\vec{x}_1,\ldots,\vec{x}_m$. Therefore a linear dependence is really an equivalence class of such tuples. One way of putting it is that the space of linear dependences is a projective space.
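The claim that any nonzero multiple of a dependence is again a dependence can be verified directly. A minimal sketch with made-up vectors (the helper name `combination` is mine):

```python
def combination(coeffs, vectors):
    """Compute c_1*x_1 + ... + c_m*x_m componentwise."""
    return tuple(sum(c, 0) for c in (
        [ci * x[i] for ci, x in zip(coeffs, vectors)]
        for i in range(len(vectors[0]))))

x1, x2, x3 = (1, 0), (0, 1), (1, 1)
c = (1, 1, -1)                      # a linear dependence: x1 + x2 - x3 = 0
print(combination(c, (x1, x2, x3)))         # (0, 0)
scaled = tuple(5 * ci for ci in c)  # any nonzero multiple works too
print(combination(scaled, (x1, x2, x3)))    # (0, 0)
```

Since scaling the coefficient tuple by $5$ still annihilates the vectors, $(1,1,-1)$ and $(5,5,-5)$ represent the same point of the projective space of dependences.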