Analyticity Proof

Suppose $f$ is analytic on $(a, b)$. Prove that if $(c, d)$ is a subinterval of $(a, b)$ and $f(x) = 0$ for all $x$ in $(c, d)$, then $f(x) = 0$ for all $x$ in $(a, b)$.



Since this seems to be an exercise, I'll give you the technical setup and leave the core conclusion for you to work out:

Let $(\alpha,\beta)$ be the maximal open subinterval of $(a,b)$ containing $(c,d)$ on which $f=0$. (Note that we should not call this interval $(e,f)$, since $f$ already names the function.) Then, if e.g. $\alpha\neq a$, $f(\alpha)= 0$ by continuity. Since $f$ is analytic it admits a Taylor series expansion around $\alpha$, $$f(x) =\sum_{k=0}^\infty c_k (x-\alpha)^k$$

Since $f(x)=0$ for $x>\alpha$ in a neighbourhood of $\alpha$, we want to use this fact to show that $c_k= 0$ for each $k$. This will then imply $f(x)= 0$ also for $x<\alpha$ near $\alpha$, contradicting the minimality of $\alpha$ with this property. This is what I leave for you to prove.
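In case a roadmap helps, one standard way to see that each coefficient vanishes is an induction on $k$ (a sketch only; here $\alpha$ denotes the left endpoint of the maximal interval on which $f$ vanishes):

```latex
% Base case: c_0 = f(\alpha) = 0 by continuity.
% Inductive step: suppose c_0 = \dots = c_{n-1} = 0. Then for
% x > \alpha inside the interval of convergence,
\[
  0 = f(x)
    = \sum_{k=n}^{\infty} c_k (x-\alpha)^k
    = (x-\alpha)^n \Bigl( c_n + \sum_{k=n+1}^{\infty} c_k (x-\alpha)^{k-n} \Bigr),
\]
% so dividing by (x-\alpha)^n \neq 0 and letting x \to \alpha^+ gives
\[
  c_n = \lim_{x \to \alpha^+} \frac{f(x)}{(x-\alpha)^n} = 0.
\]
```

With all $c_k = 0$, the Taylor series is identically zero on a full neighbourhood of $\alpha$, which is exactly the contradiction described above.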