Why is this method of calculating this multivariable limit wrong?


So, I was studying Apostol's book while also studying methods of calculating multivariable limits on the site "Brilliant"...

In particular, in $R^2$ we can switch to polar coordinates, and we have: $\lim_{{(x,y)}\to(0,0)}f(x,y) = L$ iff $\lim_{r\to0^+}f(r\cos(\theta),r\sin(\theta)) = L$, since the condition $0\lt\sqrt{x^2+y^2}\lt\delta$ from the $\epsilon$-$\delta$ definition of the limit translates into $0\lt r \lt \delta$, where $x = r\cos(\theta)$ and $y = r\sin(\theta)$ (so the limit exists iff the limit in polar coordinates exists and is $\theta$-independent) (taken from Brilliant)

But then Apostol came up with the following function: $f(x,y) = \frac{xy^2}{x^2+y^4}$ if $x\neq 0$ and $f(0,y) = 0$, and things got messy in my mind because, if we switch to polar coordinates, it becomes $f(r\cos(\theta),r\sin(\theta)) = \frac{r\cos(\theta)\sin^2(\theta)}{\cos^2(\theta)+r^2\sin^4(\theta)}$ if $r\neq 0$, and if we make $r\to0$ we'd have $\lim_{r\to0^+}f(r\cos(\theta),r\sin(\theta)) = 0$

But, if you choose the curve $x = y^2$, we have $f(y^2,y) = \frac{1}{2}$, so if we approach the origin along that curve we'd have $\lim_{y\to0}f(y^2,y) = \frac{1}{2}$, and thus the limit would approach two different values, which would mean the limit actually doesn't exist
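The two competing computations above are easy to check numerically. Here is a quick sketch in Python (the sample radii and angle are my own choices, not from Apostol or Brilliant):

```python
import math

def f(x, y):
    # Apostol's example: f(x, y) = x*y**2 / (x**2 + y**4), with f(0, y) = 0
    if x == 0:
        return 0.0
    return x * y**2 / (x**2 + y**4)

# Along any straight line through the origin (here theta = pi/4),
# the values shrink to 0 as r -> 0+.
theta = math.pi / 4
for r in [0.1, 0.01, 0.001]:
    print(f(r * math.cos(theta), r * math.sin(theta)))

# Along the parabola x = y**2, the value is constantly 1/2.
for y in [0.1, 0.01, 0.001]:
    print(f(y**2, y))
```

Both behaviours show up clearly: the ray values tend to $0$, while the parabola values stay at $\frac{1}{2}$.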

So my doubt is about what is wrong with the procedure using polar coordinates instead of trying different curves: why didn't the polar coordinate method show me that the limit is "angle dependent" (and in fact doesn't exist)? Did I make any mistake in the procedure?

3 Answers

BEST ANSWER

Your very first statement

In particular, in $\Bbb{R}^2$ we have polar coordinates to switch on and we have: $\lim_{{(x,y)}\to(0,0)}f(x,y) = L$ iff $\lim_{r\to0^+}f(r\cos(\theta),r\sin(\theta)) = L$

is not very meaningful yet, because you haven't put a quantifier over $\theta$. I guess you meant the following:

In particular, in $\Bbb{R}^2$ we have polar coordinates to switch on and we have: $\lim\limits_{{(x,y)}\to(0,0)}f(x,y) = L$ iff for every $\theta\in \Bbb{R}$, $\lim\limits_{r\to0^+}f(r\cos(\theta),r\sin(\theta)) = L$.

Even if this is what you meant, it is false, and this is a very common misconception (unfortunately there are several notes which promote the use of polar coordinates for solving limits, without carefully explaining the subtleties).

The implication $\implies$ is true, while the reverse implication is false. This is because if you fix a value of $\theta$, then $\lim_{r\to0^+}f(r\cos(\theta),r\sin(\theta))$ is a limit of a single-variable function along a certain straight half-line (i.e. it is a one-sided limit along a straight line), which is a much weaker condition than what is actually required ($\lim_{(x,y)\to (0,0)}f(x,y)$ requires the limit to exist regardless of how you approach the origin: straight line, curvy line, zig-zag/criss-cross/oscillatory, whatever).

In fact your function is a perfect example, because it shows that along EVERY straight line to the origin, the limit of the function is $0$, yet despite this the multivariable limit $\lim_{(x,y)\to (0,0)}f(x,y)$ does not exist.


Just to drive the point home, let's write out in terms of quantifiers what each statement means:

  1. $\lim_{(x,y)\to (0,0)}f(x,y) = L$ means:

for every $\epsilon>0$, there is a $\delta>0$ such that for all $(x,y) \in \Bbb{R}^2$, if $0< \lVert (x,y)\rVert<\delta$ then $|f(x,y) - L| < \epsilon$.

  2. For every $\theta\in \Bbb{R}$, $\lim_{r\to 0^+} f(r\cos\theta, r \sin \theta) = L$ means:

For every $\theta\in \Bbb{R}$, for every $\epsilon>0$, there is a $\delta>0$ such that for all $r>0$, if $0<r<\delta$ then $|f(r\cos\theta, r \sin \theta) - L|< \epsilon$.

  3. $\lim_{r\to 0^+}f(r\cos\theta, r \sin\theta) = L$, uniformly in $\theta$, means:

For every $\epsilon>0$ there is a $\delta>0$ such that for all $r>0$ and $\theta\in \Bbb{R}$, if $0<r<\delta$ then $|f(r\cos\theta, r \sin \theta) - L|< \epsilon$.

Note the differences in the statements, especially between 2 and 3 in terms of the quantifiers. We have $(1)\iff (3)$, and $(1)\implies (2)$ (so trivially $(3)\implies (2)$), but your specific function shows that $(2)\nRightarrow (1)$.

In (1) and (3), the $\delta$ depends only on $\epsilon$, while in $(2)$, the $\delta$ depends on $\theta$ and $\epsilon$ (which is why the order of quantifiers matters). Also, if you've seen the concept of uniform continuity, then you'll observe that it is a similar switch in the order of quantifiers which distinguishes between $(2)$ and $(3)$.
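The $\theta$-dependence of $\delta$ can be made concrete for this particular $f$. A short Python sketch (my own, not part of the answer): along each ray at angle $\theta$, the function does tend to $0$, but $|f|$ still reaches the value $\frac{1}{2}$ at the radius $r^* = \cos\theta/\sin^2\theta$ (the point where the ray crosses the parabola $x = y^2$), and $r^*$ shrinks to $0$ as $\theta \to \pi/2$, so no single $\delta$ works for all $\theta$ once $\epsilon < \frac{1}{2}$:

```python
import math

def f(x, y):
    return 0.0 if x == 0 else x * y**2 / (x**2 + y**4)

# Along the ray at angle theta in (0, pi/2), |f| peaks (with value 1/2) at
# r* = cos(theta)/sin(theta)**2, which tends to 0 as theta -> pi/2.
# So for eps < 1/2, the delta admissible for that ray must shrink with theta.
for theta in [1.0, 1.4, 1.5, 1.55]:
    r_star = math.cos(theta) / math.sin(theta)**2
    peak = f(r_star * math.cos(theta), r_star * math.sin(theta))
    print(theta, r_star, peak)
```

The peak value stays at $\frac{1}{2}$ while the radius at which it occurs collapses toward the origin, which is exactly the failure of statement (3) despite statement (2) holding with $L = 0$.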

ANSWER

The moral of this: use Lagrange Multipliers to confirm the extreme values of your fraction as the denominator shrinks.

At some point you will learn Lagrange Multipliers. With a denominator such as $x^2 + y^4,$ positive definite but not homogeneous, you can take the denominator as equal to a (positive) number you name $c.$ So, the gradient of the numerator $x y^2$ as a row vector is $$ ( y^2, 2xy ) $$ and the denominator gives $$ ( 2x, 4 y^3) $$

If either of $x,y$ is zero (but not both), the fraction is zero. With nonzero variables, the two gradients are parallel when the cross products match, $$ y^2 \cdot 4 y^3 = 2xy \cdot 2x , $$ $$ y^5 = y x^2 , $$ $$ y^4 = x^2 , $$ $$ x = \pm y^2 . $$ When $x=y^2$ the fraction is $1/2.$ When $x = - y^2$ the fraction is $-1/2.$ Note that no explicit value is needed for $c$; in this problem it disappears. In other Lagrange multiplier problems it may be necessary to use terms that depend on $c.$
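The parallel-gradient condition above can be verified numerically; here is a minimal Python check (the sample value $y = 0.3$ is my own choice), using the fact that two plane vectors are parallel exactly when their 2-D cross product vanishes:

```python
# Numerical check of the Lagrange-multiplier computation:
# the gradients of the numerator x*y**2 and the denominator x**2 + y**4
# are parallel exactly on the curves x = +/- y**2, where f = +/- 1/2.
def grad_num(x, y):
    return (y**2, 2 * x * y)

def grad_den(x, y):
    return (2 * x, 4 * y**3)

def cross(u, v):
    # 2-D cross product: zero iff u and v are parallel
    return u[0] * v[1] - u[1] * v[0]

y = 0.3
for x in (y**2, -y**2):
    print(cross(grad_num(x, y), grad_den(x, y)))  # both ~0
    print(x * y**2 / (x**2 + y**4))               # 1/2 and -1/2
```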

ANSWER

I agree with the answer from @peek-a-boo.

I would like to add this: if you take a different limit for each $\theta$, then you are fixing $\varepsilon>0$ and looking for a $\delta$ such that $$ |f(r\cos\theta,r\sin\theta)-L|<\varepsilon,\qquad\forall 0<r<\delta, $$ but $\delta$ can depend on $\theta,$ i.e. you are not sure that there exists a single $\delta$ such that the above inequality is true for all $\theta,$ while the definition of limit requires that the inequality be true for all points in the neighbourhood, i.e. for every $\theta.$

To be more specific to the case at hand, if you fix $\varepsilon<1/2$, and suppose $\theta\neq k\pi/2,$ you'll find a $\delta_\theta$ satisfying $$ \delta_\theta<\frac{|\cos(\theta)|}{\sin^2(\theta)}\cdot\frac{1-\sqrt{1-4\varepsilon ^2}}{2\varepsilon}$$ as in the following graph, obtained for $\varepsilon=0.1$

[Graph of the bound $\delta_\theta$ as a function of $\theta$, for $\varepsilon=0.1$]

As you can see, you cannot find a positive value of $\delta$ such that $$ \delta\leq\delta_\theta,\qquad\forall \theta, $$ in particular because near $\theta=\pi/2+k\pi$ the bound $\delta_\theta$ approaches zero.
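The behaviour of this bound is easy to reproduce numerically; here is a small Python sketch of $\delta_\theta$ for $\varepsilon=0.1$ (the sampled angles are my own choices):

```python
import math

eps = 0.1  # the epsilon used for the graph in the answer

def delta_bound(theta):
    # The bound delta_theta from the answer, valid for theta != k*pi/2:
    # |cos(theta)|/sin(theta)**2 * (1 - sqrt(1 - 4*eps**2)) / (2*eps)
    return (abs(math.cos(theta)) / math.sin(theta)**2
            * (1 - math.sqrt(1 - 4 * eps**2)) / (2 * eps))

# The bound collapses to 0 as theta approaches pi/2 (~1.5708),
# so no single positive delta lies below delta_theta for every theta.
for theta in [math.pi / 4, 1.4, 1.55, 1.57]:
    print(theta, delta_bound(theta))
```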