How is one vector related to two vectors?


Imagine that I have a vector $\mathbf{v}$ that always lies between two vectors $\mathbf{v}_1$ and $\mathbf{v}_2$. I would like to know how $\mathbf{v}$ can be obtained from $\mathbf{v}_1$ and $\mathbf{v}_2$. Writing $$\mathbf{v}=\alpha\mathbf{v}_1+\beta\mathbf{v}_2,$$ I want to find $\alpha$ and $\beta$.

$\alpha$ and $\beta$ can be obtained easily by solving a system of two equations in two variables. The problem arises when the angle between $\mathbf{v}_1$ and $\mathbf{v}_2$ tends to zero. In this situation one of the coefficients $\alpha$ or $\beta$ should be zero, but the system becomes ill-conditioned (division by a near-zero determinant), and the computed solution does not give $\alpha$ or $\beta$ equal to zero.
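The ill-conditioning described above is easy to reproduce numerically. A minimal NumPy sketch with made-up vectors (the values are illustrative, not from the question):

```python
import numpy as np

# Two nearly parallel vectors: the 2x2 system for (alpha, beta)
# becomes ill-conditioned as the angle between them tends to zero.
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1e-12])          # almost parallel to v1
v  = np.array([1.0, 0.0])            # the vector to decompose

# Matrix with v1 and v2 as columns; its condition number blows up.
M = np.column_stack((v1, v2))
print(np.linalg.cond(M))             # a huge number (~1e12 here)

# The solve still runs, but tiny perturbations in v would produce
# enormous changes in alpha and beta.
alpha, beta = np.linalg.solve(M, v)
print(alpha, beta)
```

A large condition number means small errors in $\mathbf{v}$ are amplified by roughly that factor in the computed coefficients, which is exactly the instability the question describes.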

I also tried to define the problem as $$\mathbf{v}=\lambda\mathbf{v}_1+(1-\lambda)\mathbf{v}_2$$ and to solve $$\min_{\lambda} \|\mathbf{v}\|^2,$$ but the problem is still ill-conditioned.

Do you have any suggestions to solve this problem?

Any vector between two vectors should be expressible in terms of those two vectors, but how should the problem be defined? A vector is defined by magnitude and direction; in my problem the magnitude is not important. I only care about the direction.

The values of $\alpha$ and $\beta$ depend on the magnitude of $\mathbf{v}$. How can I define the problem so that only the direction is taken into account?



1 Answer


The most straightforward way to find $\alpha$ and $\beta$ is to use matrix methods:

$$ \vec{v} = \alpha\vec{v}_1 + \beta\vec{v}_2 = \begin{bmatrix} | & | \\ \vec{v}_1 & \vec{v}_2 \\ | & | \\ \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \end{bmatrix} $$ where the matrix has the vectors $\vec{v}_1$ and $\vec{v}_2$ as its columns and the coefficients $\alpha$ and $\beta$ are written as a single vector. Then $\alpha$ and $\beta$ can be found by finding the inverse of the matrix: $$ \begin{bmatrix} | & | \\ \vec{v}_1 & \vec{v}_2 \\ | & | \\ \end{bmatrix}^{-1}\vec{v} = \begin{bmatrix} \alpha \\ \beta \end{bmatrix} $$ If the matrix is $\begin{bmatrix}a & b \\ c & d\end{bmatrix}$, its inverse is $\frac{1}{ad - bc}\begin{bmatrix}d & -b \\ -c & a\end{bmatrix}$. In this case, $\vec{v}_1 = \begin{bmatrix}a\\c\end{bmatrix}$ and $\vec{v}_2 = \begin{bmatrix}b\\d\end{bmatrix}$.
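The matrix approach above can be sketched in a few lines of NumPy, using the explicit $2\times 2$ inverse formula (the example vectors are made up for illustration):

```python
import numpy as np

# Hypothetical example vectors (not from the original post).
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])
v  = np.array([3.0, 2.0])   # v = alpha*v1 + beta*v2, to be decomposed

# Build the matrix whose columns are v1 and v2.
M = np.column_stack((v1, v2))

# Explicit 2x2 inverse: inv([[a,b],[c,d]]) = 1/(ad-bc) * [[d,-b],[-c,a]]
a, b = M[0]
c, d = M[1]
det = a * d - b * c
M_inv = np.array([[d, -b], [-c, a]]) / det

alpha, beta = M_inv @ v
print(alpha, beta)   # 1.0 2.0, since v = 1*v1 + 2*v2
```

In practice `np.linalg.solve(M, v)` is preferable to forming the inverse explicitly, but the hand-written formula mirrors the derivation above.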

Now, the same division-by-zero problem occurs when $\vec{v}_1$ and $\vec{v}_2$ are parallel, so you will need to check for this before inverting the matrix. One way is to test whether $$\frac{|\vec{v}_1 \times \vec{v}_2|}{|\vec{v}_1||\vec{v}_2|} < \epsilon,$$ where $\times$ denotes the vector cross product (in 2D, equivalent to the matrix determinant $ad-bc$), the vertical bars denote the magnitude of a vector, and $\epsilon$ is some small positive number (much less than one) that will depend on your application.

If your vectors are nearly parallel (the ratio above falls below $\epsilon$), then you need to switch to a different method. Assume you just want $\alpha$: \begin{align} \alpha\vec{v}_1 &= \vec{v} \\ \alpha|\vec{v}_1|^2 &= \vec{v}_1\cdot\vec{v} \\ \alpha &= \frac{\vec{v}_1\cdot\vec{v}}{|\vec{v}_1|^2} \end{align} where $\cdot$ denotes the vector dot product; the second line follows from taking the dot product of both sides with $\vec{v}_1$.
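This fallback is just the scalar coefficient of the orthogonal projection of $\vec{v}$ onto $\vec{v}_1$; a minimal sketch with made-up vectors:

```python
import numpy as np

def project_coefficient(v, v1):
    """Return alpha = (v1 . v) / |v1|^2, so that alpha*v1 is the
    orthogonal projection of v onto v1."""
    return np.dot(v1, v) / np.dot(v1, v1)

# Hypothetical example: v1 along the x-axis, v slightly off-axis.
v1 = np.array([2.0, 0.0])
v  = np.array([3.0, 1.0])
alpha = project_coefficient(v, v1)
print(alpha)   # 1.5
```

When $\vec{v}_1$ and $\vec{v}_2$ are nearly parallel, this gives a stable coefficient along their common direction instead of the blown-up solution of the ill-conditioned $2\times 2$ system.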