Find the linear variety sum of $L_1$ and $L_2$ (i.e., the smallest linear variety containing both $L_1$ and $L_2$)


Consider the linear varieties $L_1$ and $L_2$ in the Euclidean space $\mathbb{R}^3$, given by

$$L_1: \pmatrix{x\\y\\z}=P_1+tV_1=\pmatrix{1\\1\\2}+t\pmatrix{2\\1\\1}$$

and $$L_2: \pmatrix{x\\y\\z}=P_2+sV_2=\pmatrix{2\\1\\2}+s\pmatrix{\alpha\\2\\\beta}$$ with $t,s\in\mathbb{R}$.

Check whether there are $\alpha, \beta \in \mathbb{R}$ such that $L_1$ and $L_2$ are orthogonal (perpendicular) and concurrent, and, in that case, find the equation of the linear affine variety sum of $L_1$ and $L_2$ (i.e., the smallest linear affine variety containing both $L_1$ and $L_2$).

My Solution:

I wrote the Cartesian equations of each linear variety as follows: $$L_1:\begin{cases} x-2z=-3\\ y-z=-1\end{cases}$$ $$L_2:\begin{cases} 2x-\alpha y=4-\alpha\\ -\beta y+2z=4-\beta\end{cases}$$ Now, for the lines to be perpendicular, we need $\langle \operatorname{dir}(L_1),\operatorname{dir}(L_2)\rangle=0$, which gives $2\alpha+\beta+2=0$.
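As a sanity check, the Cartesian equations and the orthogonality condition can be verified symbolically (a sympy sketch; the variable names are mine):

```python
import sympy as sp

t, s, alpha, beta = sp.symbols('t s alpha beta')

# Generic points on L1 and L2 in parametric form
x1, y1, z1 = sp.Matrix([1, 1, 2]) + t * sp.Matrix([2, 1, 1])
x2, y2, z2 = sp.Matrix([2, 1, 2]) + s * sp.Matrix([alpha, 2, beta])

# The Cartesian equations must vanish identically along each line
assert sp.simplify(x1 - 2*z1 + 3) == 0
assert sp.simplify(y1 - z1 + 1) == 0
assert sp.simplify(2*x2 - alpha*y2 - (4 - alpha)) == 0
assert sp.simplify(-beta*y2 + 2*z2 - (4 - beta)) == 0

# Orthogonality of the two direction vectors
dot = sp.Matrix([2, 1, 1]).dot(sp.Matrix([alpha, 2, beta]))
assert sp.expand(dot - (2*alpha + beta + 2)) == 0
```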

and for them to be concurrent I used the ranks of the matrices: \begin{align*} rank \begin{bmatrix} 1 & 0 & -2 & -3\\ 0 & 1 & -1 & -1\\ 2 & -\alpha & 0 & 4-\alpha\\ 0 & -\beta & 2 & 4-\beta\\ \end{bmatrix}=rank \begin{bmatrix} 1 & 0 & -2 \\ 0 & 1 & -1\\ 2 & -\alpha & 0\\ 0 & -\beta & 2\\ \end{bmatrix}=3 \end{align*} Now the confusion comes in:

From \begin{align*}rank \begin{bmatrix} 1 & 0 & -2 \\ 0 & 1 & -1\\ 2 & -\alpha & 0\\ 0 & -\beta & 2\\ \end{bmatrix}=3 \end{align*} I obtain that we cannot have $\alpha=4$ and $\beta=2$ simultaneously (otherwise the direction vectors would be parallel), whereas from the rank of the augmented matrix I obtain that $\beta=2$.
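Both rank conditions can be checked symbolically: the $4\times4$ augmented matrix drops to rank $3$ exactly when its determinant vanishes, while the $4\times3$ coefficient matrix drops below rank $3$ only when every $3\times3$ minor vanishes (a sympy sketch):

```python
import sympy as sp

alpha, beta = sp.symbols('alpha beta')

# Augmented matrix of the combined system for L1 and L2
A_aug = sp.Matrix([
    [1,      0, -2,        -3],
    [0,      1, -1,        -1],
    [2, -alpha,  0, 4 - alpha],
    [0,  -beta,  2,  4 - beta],
])

# rank(A_aug) = 3 forces det(A_aug) = 0
det = sp.expand(A_aug.det())
assert det == 2*beta - 4          # the augmented condition is beta = 2

# The coefficient matrix has rank < 3 only if all 3x3 minors vanish
A = A_aug[:, :3]
minors = [A[rows, :].det() for rows in ([0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3])]
assert sp.solve(minors, [alpha, beta], dict=True) == [{alpha: 4, beta: 2}]
```

So the coefficient matrix fails to have rank $3$ only when $\alpha=4$ and $\beta=2$ hold simultaneously; concurrency therefore amounts to $\beta=2$ together with $\alpha\neq 4$, and there is no contradiction between the two conditions.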

If I were to find $\alpha$ and $\beta$ suited to my question, I would continue like this:

$\operatorname{dir}(L_1+L_2)=\operatorname{dir}(L_1)+\operatorname{dir}(L_2)+\langle \overrightarrow{P_1P_2}\rangle$, and then write the equation of the linear affine variety sum of $L_1$ and $L_2$.

How should I find the values of these parameters? (And how does one proceed in general when studying the relative position of two affine subspaces?)

Best answer:

You've made good progress showing that the parameters $\alpha, \beta$ must satisfy $2\alpha + \beta+2=0$.

Now, what you need to think about is the following. When two orthogonal lines are concurrent, they have an intersection point, i.e. there exist $t, s$ such that: $$ L_1(t)= \begin{bmatrix} 1\\1\\2 \end{bmatrix} + t\begin{bmatrix} 2\\1\\1 \end{bmatrix} =\begin{bmatrix} 2\\1\\2 \end{bmatrix} + s\begin{bmatrix} \alpha\\2\\-2\alpha-2 \end{bmatrix} =L_2(s) $$ (here $\beta=-2\alpha-2$ has been substituted from the orthogonality condition).

This gives $3$ equations in the $3$ unknowns $\alpha, t, s$, and they have a unique solution. You get $\alpha$ from there.
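Carrying this out symbolically (a sympy sketch, which also finishes the affine variety sum; it assumes the substitution $\beta=-2\alpha-2$ made above), one would find $\alpha=-2$, hence $\beta=2$, and the sum of the two concurrent lines is a plane:

```python
import sympy as sp

alpha, t, s = sp.symbols('alpha t s')
x, y, z = sp.symbols('x y z')

P1, V1 = sp.Matrix([1, 1, 2]), sp.Matrix([2, 1, 1])
P2, V2 = sp.Matrix([2, 1, 2]), sp.Matrix([alpha, 2, -2*alpha - 2])  # beta = -2*alpha - 2

# Concurrency: L1(t) = L2(s) gives three equations in alpha, t, s
eqs = list(P1 + t*V1 - (P2 + s*V2))
sol = sp.solve(eqs, [alpha, t, s], dict=True)
assert sol == [{alpha: -2, t: sp.Rational(1, 3), s: sp.Rational(1, 6)}]

# Intersection point and the plane spanned by the two concurrent lines
Q = (P1 + t*V1).subs(sol[0])        # intersection point (5/3, 4/3, 7/3)
n = V1.cross(V2.subs(alpha, -2))    # normal to the plane: (0, -6, 6)
plane = sp.expand(n.dot(sp.Matrix([x, y, z]) - Q))
assert plane == -6*y + 6*z - 6      # i.e. the plane y - z = -1
```

So the lines meet at $(5/3, 4/3, 7/3)$ and their affine variety sum would be the plane $y-z=-1$ (note this is consistent with the first Cartesian equation of $L_1$ found in the question).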