How to prove the existence of a mean value when it is difficult to construct a primitive function?


Suppose $f\in\mathscr{C}\big([0,1]\big)$ is differentiable on $(0,1)$, with $f(0)=1$ and $f(1)=e/2$. The goal is to prove that there exists $\xi\in(0,1)$ such that: $$ e^\xi f'(\xi)-e^\xi f(\xi)+f^2(\xi)=0~~~~(*) $$

If $f(x)\neq0$ for every $x\in(0,1)$, applying the Lagrange mean value theorem to $$ F(x)=\frac{e^x}{f(x)} $$ solves the problem immediately.
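Spelling out that computation: since $f(0)=1$ and $f(1)=e/2$, $$ F(1)-F(0)=\frac{e}{e/2}-\frac{1}{1}=2-1=1 \, , $$ so the mean value theorem yields $\xi\in(0,1)$ with $$ F'(\xi)=\frac{e^\xi f(\xi)-e^\xi f'(\xi)}{f^2(\xi)}=1 \, , $$ and clearing the denominator gives $e^\xi f(\xi)-e^\xi f'(\xi)=f^2(\xi)$, which is exactly $(*)$.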

I suspect the condition $f(x)\neq0~~\big(x\in(0,1)\big)$ is necessary, since $F$ now has poles at the zeros of $f$, which violates the hypotheses of Lagrange's theorem. However, I have not managed to construct a counterexample, and I am inclined to believe the conclusion holds even without that extra condition. But I cannot produce a proof in either direction. So is $(*)$ true under the original conditions?

Best Answer

Yes, $(*)$ has a solution even if $f$ has zeros in $(0, 1)$. Let $c$ be a point where $f$ attains its minimum on $[0, 1]$.

If $f(c) > 0$ then $f$ is positive everywhere, and the mean-value theorem applied to $e^x/f(x)$ shows that $(*)$ has a solution, as you correctly argued.

If $f(c) = 0$ then $c$ is an interior point of $[0, 1]$ (because $f(0) = 1 > 0$ and $f(1) = e/2 > 0$), so $f'(c) = 0$ by Fermat's theorem. Every term of $(*)$ then vanishes at $c$, so $\xi = c$ is a solution.

It remains to consider the case $f(c) < 0$. Let $(a, b)$ be the largest interval containing $c$ on which $f$ is negative, i.e. $$ a = \max \{ x \in [0, c] \mid f(x) = 0 \} \, , \quad b = \min \{ x \in [c, 1] \mid f(x) = 0 \} \, . $$ Then $f(a) = f(b) = 0$ and the function $$ h(x) = \frac{e^x}{f(x)} - x $$ is defined and negative on $(a, b)$, with $$ \lim_{x \to a+} h(x) = \lim_{x \to b-} h(x) = -\infty \, . $$ It follows that $h$ attains its maximum at some point $d \in (a, b)$. Then $h'(d) = 0$, i.e. $$ \frac{e^d f(d) - e^d f'(d)}{f^2(d)} - 1 = 0 \, , $$ which rearranges to $(*)$, so $\xi = d$ is a solution.
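As a quick numerical sanity check (not part of the proof), one can pick a concrete $f$ with the required endpoint values that dips negative inside $(0,1)$ and locate a root of the left-hand side of $(*)$ by bisection. The particular $f$ below (linear interpolation between the endpoint values minus $3\sin(\pi x)$) is an arbitrary choice of mine, as are the grid size and tolerance:

```python
import math

# A concrete f with f(0) = 1, f(1) = e/2 that is negative on part of (0, 1).
# The choice of f (linear interpolation minus 3*sin(pi*x)) is arbitrary.
def f(x):
    return 1 + (math.e / 2 - 1) * x - 3 * math.sin(math.pi * x)

def fp(x):
    # Derivative of f.
    return (math.e / 2 - 1) - 3 * math.pi * math.cos(math.pi * x)

# g(x) = e^x f'(x) - e^x f(x) + f(x)^2; (*) asks for a zero of g in (0, 1).
def g(x):
    return math.exp(x) * fp(x) - math.exp(x) * f(x) + f(x) ** 2

# Scan a grid for a sign change of g, then bisect to locate a root xi.
xs = [i / 1000 for i in range(1, 1000)]
lo = next(x for x, y in zip(xs, xs[1:]) if g(x) * g(y) <= 0)
hi = lo + 1 / 1000
for _ in range(60):
    mid = (lo + hi) / 2
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid
xi = (lo + hi) / 2
print(xi, g(xi))
```

Despite $f$ vanishing inside $(0,1)$, the scan finds a sign change of $g$ and the bisection converges to a $\xi$ with $g(\xi) \approx 0$, consistent with the claim.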