Stuck on an equation to find a point on the intersection of two planes that is closest to the origin,


Edit 2: I have solved it, in the sense that I found one critical point, but it is in a pretty nasty form, so I wouldn't know how to justify that it is indeed closest to the origin. Please feel free to comment on this - thanks for reading.

EDIT: I feel that I am pretty close to a solution, using the Lagrange Multiplier method. I am seeking a way to proceed from where I am currently stuck, rather than use another method to solve this minimization problem. Thanks,

The problem statement is:

Find the point on the intersection of the two planes $a_0+a_1x+a_2y+a_3z=0$ and $b_0+b_1x+b_2y+b_3z=0$ which is nearest to the origin $(0,0,0)$.

My work:

The objective function, thinking of distance to the origin, should be the tricky $\sqrt{x^2 + y^2 +z^2}$, but since the (positive) square root is a monotone increasing function, minimizing this function is the same as minimizing the easier $f(x,y,z) = x^2 + y^2 + z^2$. We use this as our objective function, subject to two constraints,

$g_1(x,y,z) = a_1x+a_2y+a_3z=-a_0$,

$g_2(x,y,z) =b_1x+b_2y+b_3z= -b_0$.

So, by the method of Lagrange Multipliers, I want to solve the equations

$$\nabla f(x,y,z) = \lambda_1 \nabla g_1(x,y,z) + \lambda_2 \nabla g_2(x,y,z)$$

$$\implies (2x,2y,2z) = \lambda_1 (a_1,a_2,a_3) + \lambda_2 (b_1,b_2,b_3)$$

$$\implies (2x) = \lambda_1 (a_1) + \lambda_2 (b_1)$$ $$\implies (2y) = \lambda_1 (a_2) + \lambda_2 (b_2)$$ $$\implies (2z) = \lambda_1 (a_3) + \lambda_2 (b_3)$$

Now, multiplying the first equation by $x$, the second by $y$, and the third by $z$, then adding all three equations and using the two constraints, gives me

$$2(x^2 + y^2 + z^2) = \lambda_1(-a_0) + \lambda_2(-b_0)$$

I am currently stuck here. How could I proceed?

Any hints or suggestions are welcome.

Thanks,
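One way to continue from here: the three component equations give $x = \frac{\lambda_1 a_1 + \lambda_2 b_1}{2}$, and similarly for $y$ and $z$; substituting these into the two constraints yields a $2 \times 2$ linear system in $(\lambda_1, \lambda_2)$. A minimal numerical sketch of this step (the plane coefficients are made-up example values, not from the question):

```python
# Substituting x = (l1*a1 + l2*b1)/2 (and similarly for y, z) into the
# two constraints yields the 2x2 linear system
#   |a|^2 * l1 + (a.b) * l2 = -2*a0
#   (a.b) * l1 + |b|^2 * l2 = -2*b0
# The plane coefficients below are made-up example values.

a0, a = 1.0, (1.0, 2.0, 2.0)    # plane 1: a0 + a1*x + a2*y + a3*z = 0
b0, b = -2.0, (2.0, -1.0, 1.0)  # plane 2: b0 + b1*x + b2*y + b3*z = 0

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A11, A12, A22 = dot(a, a), dot(a, b), dot(b, b)
det = A11 * A22 - A12 * A12     # nonzero iff the planes are not parallel

# Cramer's rule for (l1, l2):
l1 = (-2 * a0 * A22 + 2 * b0 * A12) / det
l2 = (-2 * b0 * A11 + 2 * a0 * A12) / det

# The critical point (x, y, z) = (l1*a + l2*b)/2:
x = tuple((l1 * ai + l2 * bi) / 2 for ai, bi in zip(a, b))
print(x)  # lies on both planes
```

Since $f$ is a strictly convex function and the constraint set (a line) is convex, this unique critical point is the global minimizer, which answers the concern in Edit 2.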


There are 3 best solutions below

---

I would like to give you a hint because it is pretty easy.

First of all, you need to find the intersection of the two planes (a line). Because you have 2 equations in 3 variables, you can reduce them to 1 equation in 2 variables. Do not forget to impose conditions to make sure the two planes are not parallel to each other.

Secondly, once you have the line that the point nearest the origin belongs to, you need to calculate the distance between the origin and a point on that line, and minimize that distance to get the solution.

Finally, use this solution to find the last coordinate by substituting it into one of the two given plane equations.
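A sketch of this recipe with made-up plane coefficients: the intersection line is $p + t\,d$ with direction $d = a \times b$, and the nearest point to the origin is found by minimizing $|p + t\,d|^2$ over $t$.

```python
# 1) Intersect the planes to get a line p + t*d.
# 2) Minimize the squared distance |p + t*d|^2 over t.
# Plane coefficients are made-up example values.

a0, a = 1.0, (1.0, 2.0, 2.0)    # plane 1: a0 + a.x = 0
b0, b = -2.0, (2.0, -1.0, 1.0)  # plane 2: b0 + b.x = 0

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

d = cross(a, b)                 # direction of the intersection line

# A particular point p on the line: set z = 0 and solve
#   a1*x + a2*y = -a0,  b1*x + b2*y = -b0
# (this assumes the line is not parallel to the plane z = 0).
det = a[0]*b[1] - a[1]*b[0]
px = (-a0*b[1] + b0*a[1]) / det
py = (-b0*a[0] + a0*b[0]) / det
p = (px, py, 0.0)

# The minimum of |p + t*d|^2 occurs at t = -(p.d)/|d|^2.
t = -dot(p, d) / dot(d, d)
nearest = tuple(pi + t * di for pi, di in zip(p, d))
print(nearest)
```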

---

I would like to leave a comment, but being a beginner on StackExchange, my reputation is too low to do so. You may solve the equations so that one variable is eliminated, and parametrize the solution set. Since the distance between $(0,0,0)$ and a point of the solution set is given by the Pythagorean theorem, you may find the required point by completing the square or by differentiation.
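Concretely, writing the parametrized solution set as the line $x(t) = p + t\,d$ (where $p$ is a base point and $d$ a direction vector), the squared distance to the origin is a quadratic in $t$, and completing the square gives

$$|p + t\,d|^2 = |d|^2 t^2 + 2(p \cdot d)\,t + |p|^2 = |d|^2\left(t + \frac{p \cdot d}{|d|^2}\right)^2 + |p|^2 - \frac{(p \cdot d)^2}{|d|^2},$$

so the minimum occurs at $t = -\dfrac{p \cdot d}{|d|^2}$.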

---

You may be able to do this without the Lagrange method. Let $n_1, n_2$ be the unit normal vectors of the two planes. First, you can deal with the easier case $n_1 = \pm n_2$ (parallel or coincident planes), so WLOG we take $n_1 \neq \pm n_2$.

Suppose that the planes $$\frac{a_0}{\sqrt{a_1^2 + a_2^2 + a_3^2}}+ x\cdot n_1 = 0, \qquad \frac{b_0}{\sqrt{b_1^2 + b_2^2 + b_3^2}} + x \cdot n_2 = 0 $$ intersect in the line $$x = \alpha n_1 + \beta n_2 + k (n_1 \times n_2),\text{ where $k$ is a real parameter. }$$ Then $\alpha, \beta$ are determined by $$\frac{a_0}{\sqrt{a_1^2 + a_2^2 + a_3^2}} + \alpha + n_1 \cdot n_2 \,\beta = 0\\\frac{b_0}{\sqrt{b_1^2 + b_2^2 + b_3^2}} + n_1 \cdot n_2 \, \alpha + \beta = 0 $$

We also have $$|x|^2 = x \cdot x = \alpha^2 + 2 \alpha \beta\, n_1 \cdot n_2 + \beta^2 + k^2|n_1 \times n_2|^2, $$ which attains its minimum $$\alpha^2 + 2 \alpha \beta\, n_1 \cdot n_2 + \beta^2 $$ at $k = 0$, i.e. at the point $x = \alpha n_1 + \beta n_2$.
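A numerical sketch of this answer (the plane coefficients are made-up example values): solve the $2 \times 2$ system for $\alpha, \beta$; since the minimum over $k$ is at $k = 0$, the nearest point is $\alpha n_1 + \beta n_2$.

```python
import math

# Made-up example plane coefficients.
a0, a = 1.0, (1.0, 2.0, 2.0)
b0, b = -2.0, (2.0, -1.0, 1.0)

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

na = math.sqrt(dot(a, a))
nb = math.sqrt(dot(b, b))
n1 = tuple(ai / na for ai in a)   # unit normal of plane 1
n2 = tuple(bi / nb for bi in b)   # unit normal of plane 2

c = dot(n1, n2)                   # n1 . n2; |c| < 1 since n1 != +/- n2

# The 2x2 system from the answer:
#   alpha + c*beta = -a0/|a|,   c*alpha + beta = -b0/|b|
r1, r2 = -a0 / na, -b0 / nb
det = 1.0 - c * c
alpha = (r1 - c * r2) / det
beta = (r2 - c * r1) / det

# Minimum of |x|^2 over the line occurs at k = 0:
nearest = tuple(alpha * u + beta * v for u, v in zip(n1, n2))
print(nearest)
```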