Lagrange multiplier problem


I am stuck with the following question: use Lagrange multipliers to determine the shortest distance from a point $x \in \mathbb{R}^n$ to the plane $\{y \mid b^T y = c\}$. Could someone please walk me through the problem step by step, as I am not even sure where to start.

Here are a few hints to get you started:

  • To use the method of Lagrange multipliers, you must know ahead of time that the extremum in question exists. Typically in a Calc 3 course you would skip this step. If you do have to show existence, note that the distance increases as $|y| \to \infty$, so you may restrict attention to a compact set.
  • The point $x$ is fixed ahead of time, so we are trying to find a $y$ on the plane such that $d(x,y)$ is minimum. That is, we are trying to minimize the function $y\longmapsto d(x,y)$ subject to the constraint $b^Ty=c$ (this is the condition that $y$ is on the plane).
  • It will probably be easier to minimize $d^2(x,y)$; do you see why this is equivalent? Call $f(y) = d^2(x,y)$. It may help to write $f(y) = f(y_1,\ldots,y_n)$ to recognize it as a function of several variables.
  • It may help to write the constraint in more familiar terms: $b^Ty = c$ means $g(y_1,\ldots,y_n)=\sum_{i=1}^n b_iy_i = c$.
  • The method of Lagrange multipliers tells us that at the minimum, $\nabla f(y_1,\ldots,y_n) = \lambda \nabla g(y_1,\ldots,y_n)$ for some scalar $\lambda$, together with the constraint $g(y) = c$.
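
Concretely, with $f(y) = \lVert y - x \rVert^2$ and $g(y) = b^T y$ as above, the gradients are quick to compute (a sketch of the calculation, using the notation from the hints):

$$\nabla f(y) = 2(y - x), \qquad \nabla g(y) = b,$$

so the Lagrange condition reads $2(y - x) = \lambda b$: the displacement from $x$ to the closest point on the plane must be parallel to the normal vector $b$.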

I have set up most of the pieces for you; what remains is just a minor calculation. Do you see how to finish the problem from here?
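
If you want to sanity-check your final answer numerically: the Lagrange condition forces $y - x$ to be a multiple of $b$, which leads to the closed-form distance $|b^T x - c| / \lVert b \rVert$. Here is a quick NumPy check (the concrete values of `x`, `b`, `c` are example data of my choosing, not from the question):

```python
import numpy as np

# A concrete point and plane b^T y = c (example data, not from the question).
x = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, -1.0, 2.0])
c = 4.0

# The Lagrange condition 2(y - x) = lambda * b forces y - x parallel to b;
# substituting y = x + (lambda/2) b into b^T y = c solves for lambda/2:
t = (c - b @ x) / (b @ b)
y_star = x + t * b  # the closest point on the plane

assert np.isclose(b @ y_star, c)  # y_star satisfies the constraint
assert np.isclose(np.linalg.norm(y_star - x),
                  abs(b @ x - c) / np.linalg.norm(b))  # matches the closed form
```

Both assertions pass: the minimizer found from the Lagrange condition lies on the plane, and its distance to $x$ agrees with the closed-form expression.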