Limit of $\exp(f(x))$ implies existence of limit of $f(x)$


My question comes from this observation: if $f(x)$ is a complex-valued, continuous function on the interval $(0,1)$, and the limit of $f(x)$ at $x=1$ exists and is finite, then the limit of $\exp(f(x))$ at $x=1$ also exists, since $\exp$ is continuous.

I guess that the converse also holds: if the limit of $\exp(f(x))$ at $x=1$ exists and is not $0$, then the limit of $f(x)$ at $x=1$ exists. But I am stuck here, since the logarithm is not single-valued. Thus, even if we know the limit of $\exp(f(x))$ at $x=1$, we cannot simply read off what the limit of $f(x)$ would have to be. How do I approach this problem?
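To make the worry concrete, here is a small numerical sketch (the specific $f$ is my own illustrative choice): a continuous $f$ with $\exp(f(x)) \to e$ as $x \to 1$, while $f(x) \to 1 + 2\pi i$, a *non-principal* logarithm of $e$. So knowing the limit of $\exp(f)$ does not by itself tell us which logarithm $f$ converges to.

```python
import cmath
import math

# Continuous f on (0, 1): f(x) = x + 2*pi*i*x.
# exp(f(x)) -> e (nonzero) as x -> 1, but f(x) -> 1 + 2*pi*i,
# which is a non-principal logarithm of e.
def f(x):
    return x + 2j * math.pi * x

x = 0.999999
limit_f = 1 + 2j * math.pi

print(abs(cmath.exp(f(x)) - math.e))      # small: exp(f) is near e
print(abs(f(x) - limit_f))                # small: f is near 1 + 2*pi*i
print(abs(cmath.log(math.e) - limit_f))   # large: principal log gives 1, not 1 + 2*pi*i
```

The last line shows the gap the question is about: the principal logarithm of the limit of $\exp(f)$ is $1$, which is off by $2\pi i$ from the actual limit of $f$.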

Any help is appreciated.

Best answer:

The heart of the problem is to show that when $e^x$ and $e^y$ are close to each other, then there exists an integer $n$ such that $x$ and $y + 2 \pi i n$ are close to each other. This can be done by showing that for a suitable branch of the logarithm, $\log e^x$ and $\log e^y$ are close to each other. Continuity of $f$ will imply that the $n$'s that appear are $0$, and the existence of the limit of $f$ at $1$ will follow from Cauchy's criterion.
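As a sanity check on why continuity of $f$ matters here, consider this numerical sketch (the specific discontinuous $f$ is my own illustrative choice): $\exp(f)$ is identically $1$ near $x = 1$, yet $f$ has no limit, because the integers $n$ that appear are not forced to be $0$.

```python
import cmath
import math

# A *discontinuous* f for which the converse fails:
# exp(f(x)) == 1 for every x in (0, 1), yet f has no limit at 1
# (its imaginary part jumps by multiples of 2*pi and blows up).
def f(x):
    return 2j * math.pi * math.floor(1 / (1 - x))

for x in (0.9, 0.99, 0.999):
    print(x, f(x).imag, abs(cmath.exp(f(x)) - 1))
```

The values $\exp(f(x))$ agree to machine precision, while $f(x)$ itself jumps between logarithms of $1$ on different branches; continuity is exactly what rules this out in the proof.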

Formally, let $a$ be the limit of $e^{f(x)}$ at $1$; by assumption $a \neq 0$. Let $\epsilon > 0$. Then there exists $\delta > 0$ such that for all $x \in (1-\delta, 1)$ we have $|e^{f(x)} - a| < \epsilon$. In particular, by the triangle inequality, for $x, y \in (1-\delta, 1)$ we have $|e^{f(x)} - e^{f(y)}| < 2 \epsilon$.

Take a branch of the logarithm whose branch cut does not contain $a$ (possible since $a \neq 0$), and call it $\log$. Take any $\epsilon' > 0$. By continuity of this branch near $a$, there exists $\delta' > 0$ such that, for points within distance $\delta'$ of $a$, the inequality $|e^{f(x)} - e^{f(y)}| < \delta'$ implies $|\log e^{f(x)}- \log e^{f(y)}| < \epsilon'$.
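One concrete way to build such a branch (a sketch; the helper name `branch_log` is mine): rotate the plane so that $a$ lands on the positive real axis, apply the principal logarithm there, and rotate back. The cut then lies along the ray opposite $a$, so it avoids $a$. Near $a = -1$, where the principal logarithm jumps by $2\pi i$, this rotated branch is continuous:

```python
import cmath
import math

# A branch of log whose cut avoids a given nonzero point a:
# rotate a onto the positive real axis, take the principal log, rotate back.
# The cut of this branch lies along the ray through -a.
def branch_log(z, a):
    theta = cmath.phase(a)
    return cmath.log(z * cmath.exp(-1j * theta)) + 1j * theta

a = -1.0 + 0j                     # sits on the principal branch cut
z1, z2 = -1 + 1e-3j, -1 - 1e-3j   # two nearby points straddling that cut

jump_principal = abs(cmath.log(z1) - cmath.log(z2))         # ~ 2*pi: jump
jump_branch = abs(branch_log(z1, a) - branch_log(z2, a))    # ~ 2e-3: continuous
print(jump_principal, jump_branch)
```

This is exactly the continuity statement used above: once the cut avoids $a$, values of $e^{f(x)}$ close to $a$ have close logarithms.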

Now take $\epsilon = \delta' / 2$ above. Then for $x, y \in (1-\delta, 1)$ we have $|\log e^{f(x)}- \log e^{f(y)}| < \epsilon'$. Since $\log e^{f(x)}$ differs from $f(x)$ by an integer multiple of $2 \pi i$ (and similarly for $y$), there exists an integer $n$, depending on $x$, $y$ and $\epsilon'$, such that $|f(x) - f(y) + 2 \pi i n| < \epsilon'$.

We show that $n$ is $0$ as soon as $\epsilon'$ is small enough. Clearly when $\epsilon' < \pi$ there is at most one such $n$ for each pair $x, y$, and it consequently does not depend on $\epsilon'$. That is, taking $\epsilon' = 0.1$ we obtain a function $n(x, y)$ defined for $x, y \in (1-\delta, 1)$.
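The integer $n(x, y)$ can be computed explicitly, since only the imaginary parts are involved: it is the nearest integer to $\operatorname{Im}(f(y) - f(x)) / 2\pi$. A small sketch (the helper name `nearest_n` and the sample values are mine):

```python
import math

# The unique integer n with |fx - fy + 2*pi*i*n| < pi, given that
# fx and fy have nearly equal exponentials: only imaginary parts matter,
# so n is the nearest integer to Im(fy - fx) / (2*pi).
def nearest_n(fx, fy):
    return round((fy - fx).imag / (2 * math.pi))

fx = 1.0 + 0.1j
fy = fx + 2j * math.pi * 3 + 1e-3j   # exp(fx) and exp(fy) nearly agree

n = nearest_n(fx, fy)
print(n, abs(fx - fy + 2j * math.pi * n))   # n = 3, residual 1e-3 < 0.1
```

Here $f(x)$ and $f(y)$ sit on branches three sheets apart, and `nearest_n` recovers that offset; the residual after correcting by $2\pi i n$ is below the $0.1$ threshold used in the proof.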

We show that $n$ is locally constant. Indeed, by continuity of $f$, if $x$ or $y$ is changed a tiny bit, say to $x', y'$, the inequality $|f(x') - f(y') + 2 \pi i \, n(x, y)| < 0.1$ still holds, so $n(x', y') = n(x, y)$ by uniqueness.

Because $n(x, y)$ is a locally constant function on the connected set $(1-\delta, 1) \times (1- \delta, 1)$, it is constant. Taking $x = y$ shows that the constant is $0$.

We have now shown that for all $\epsilon' > 0$ there exists $\delta > 0$ such that $x, y \in (1-\delta, 1)$ implies $|f(x) - f(y)| < \epsilon'$. By Cauchy's criterion, $f$ has a limit at $1$.