Minimum radius vector of a curve


Find minimum distance from origin of curve $${a^2\over x^2}+{b^2\over y^2}=1$$

I know how to solve this with derivatives, and the answer is $a+b$. But what is wrong with the following method?

Take an arbitrary point of the curve, $\left({a\over \cos t},{b\over \sin t}\right)$. Since the curve is symmetric about both axes, it suffices to consider the first quadrant. Now, by the AM–GM inequality, $$r^2={a^2\over \cos^2t}+{b^2\over \sin^2t}\geq \frac{2ab}{\sin t \cos t}$$ $$r^2\geq{4ab \over \sin2t}\geq 4ab$$ $$r\geq 2\sqrt{ab}$$

This seems perfectly legitimate and intuitive. Where does it go wrong? How can I avoid this kind of mistake in an exam?
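A quick numerical sketch (my own check, with arbitrary values $a=2$, $b=3$, not part of the original question) makes the gap visible: the actual minimum distance is $a+b$, which is strictly larger than the AM–GM bound $2\sqrt{ab}$ whenever $a\ne b$.

```python
import numpy as np

# Sample r^2(t) = a^2/cos^2 t + b^2/sin^2 t over the first quadrant
# and compare its minimum with the AM-GM lower bound 2*sqrt(ab).
a, b = 2.0, 3.0  # arbitrary choice with a != b

t = np.linspace(1e-4, np.pi / 2 - 1e-4, 200_001)   # open interval (0, pi/2)
r2 = a**2 / np.cos(t)**2 + b**2 / np.sin(t)**2     # squared distance from origin

r_min = np.sqrt(r2.min())
print(f"numerical min r   = {r_min:.6f}")          # ≈ a + b = 5
print(f"AM-GM lower bound = {2*np.sqrt(a*b):.6f}") # 2*sqrt(6) ≈ 4.898979
```

The bound is true but slack: the minimum of $r$ sits strictly above $2\sqrt{ab}$ for $a\ne b$.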

1 Answer

To obtain $r\ge 2\sqrt{ab}$, you chained two inequalities: $$r^2={a^2\over \cos^2t}+{b^2\over \sin^2t}\ge \frac{2ab}{\sin t \cos t}={4ab \over \sin2t}\geq 4ab.$$ For the final bound to be the true minimum, all inequalities in the chain must hold with equality simultaneously, which is not the case here in general. Indeed, the AM–GM step ${a^2\over \cos^2t}+{b^2\over \sin^2t}\ge \frac{2ab}{\sin t \cos t}$ holds with equality exactly when $|\tan t|=\left|\frac{b}{a}\right|$, while ${4ab \over \sin2t}\geq 4ab$ holds with equality exactly when $\sin 2t=1$, i.e. $t=\frac{\pi}{4}$. These two conditions are compatible only if $|a|=|b|$. So the bound $r\ge 2\sqrt{ab}$ is valid but not attained when $a\ne b$; consistently, $2\sqrt{ab}\le a+b$ by AM–GM, so it does not contradict the true minimum $a+b$.
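The mismatch between the two equality conditions can be seen numerically (a sketch of mine, again with arbitrary $a=2$, $b=3$): the true minimizer of $r^2(t)$ satisfies $\tan^2 t = \frac{b}{a}$, whereas equality in the AM–GM step would require $\tan^2 t = \frac{b^2}{a^2}$.

```python
import numpy as np

# Locate the true minimizer of r^2(t) on a fine grid and compare it
# with the AM-GM equality condition |tan t| = b/a from the answer above.
a, b = 2.0, 3.0  # arbitrary choice with a != b

t = np.linspace(1e-4, np.pi / 2 - 1e-4, 200_001)
r2 = a**2 / np.cos(t)**2 + b**2 / np.sin(t)**2
t_star = t[np.argmin(r2)]           # numerical minimizer of r^2

print(np.tan(t_star) ** 2)          # ≈ b/a = 1.5  (true minimum)
print((b / a) ** 2)                 # = 2.25 (needed for AM-GM equality)
```

Since $1.5 \ne 2.25$, the AM–GM inequality is strict at the true minimizer, which is exactly why the bound $2\sqrt{ab}$ undershoots the actual minimum $a+b$.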