Prove an inequality using Taylor series


Given $f(x)$, knowing that $f'(x)$ and $f''(x)$ exist for every $0\leq x\leq1$, and provided I know that $f(0)=f(1)$ and that for each $0\leq x\leq1$, $|f''(x)|\leq A$, how can I prove that $|f'(x)|\leq A/2$ for every $0\leq x\leq1$?


There is 1 best solution below

On BEST ANSWER

Well, this answer probably isn't correct, since it presumes that the third and all higher derivatives of $\ f(x) $ are identically zero on the interval $[0, 1]$. But still, maybe it will give you some idea for how to complete this proof.

We write $\ f(x) = \sum\limits_{n=0}^{\infty} \frac { f^{(n)} (0) }{ n! } x^n $, the Taylor expansion of $\ f(x) $ about $0$, which is an exact representation of the function (provided $f$ is analytic).

With my newly added assumption, $\ f(x) = f(0) + f'(0)x + f''(0)x^2/2 $, and the fact that $\ f(0) = f(1) = f(0) + f'(0) + f''(0) / 2 $, some terms cancel out and we obtain $\ f'(0) = - f''(0) / 2 \implies f''(0) = - 2f'(0) $.
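As a quick numerical sanity check of this cancellation (still under the quadratic assumption; the specific coefficient values below are arbitrary illustrative choices, not part of the problem):

```python
# Check that f'(0) = -f''(0)/2 forces f(0) = f(1) for a quadratic f.
# The coefficients here are assumed values for illustration only.
fpp0 = 3.0                 # choose f''(0) freely
fp0 = -fpp0 / 2            # the relation derived above forces this f'(0)
f = lambda x: 1.0 + fp0 * x + fpp0 * x ** 2 / 2

assert abs(f(0.0) - f(1.0)) < 1e-12   # endpoint values agree
```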

If we put $\ g(x) = f'(x) $, then the Taylor expansion of $g(x)$ is $\ g(x) = f'(x) = f'(0) + f''(0) x = f'(0) (1 - 2x) $, using $f''(0) = -2f'(0)$ from above.

(*) Since $\ f''(x) = (f'(x))' = f''(0) = -2f'(0) $ is constant here, the given bound $\ | f''(x) | \leq A $ gives $\ |f'(0)| \leq A/2 $. Therefore $\ | f'(x) | = |f'(0)| \, | 1 - 2x | \leq \frac{A}{2} | 1 - 2x | = A \, | x - 1/2 | $, and since $\ 0 \leq x \leq 1 $ the inequality $\ | f'(x) | \leq A | x - 1/2 | \leq A/2 $ always holds true, which ends the proof.
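Under the same quadratic assumption, a short sketch that samples $|f'(x)| = |f'(0)|\,|1-2x|$ on $[0,1]$ and checks it against $A/2$ (the value of $A$ below is an arbitrary assumption):

```python
# Sample |f'(x)| on a grid over [0, 1] and confirm it never exceeds A/2.
A = 2.0                               # assumed bound on |f''|
fp0 = -A / 2                          # extremal case: f''(0) = -2 f'(0) = A
fprime = lambda x: fp0 * (1 - 2 * x)  # f'(x) = f'(0)(1 - 2x)

xs = [i / 1000 for i in range(1001)]
worst = max(abs(fprime(x)) for x in xs)
assert worst <= A / 2 + 1e-12         # |f'(x)| <= A/2 everywhere on the grid
```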

A few comments at the end. In the (*) part we obtained our given assumption again. I think that the more general proof should use that assumption instead of deriving it, because deriving it is only possible by adding further constraints to your problem. So you should probably start from there: expand $f(0)$ and $f(1)$ around an arbitrary point $x$ using Taylor's theorem with the Lagrange remainder, which needs only the existence of $f''$, and subtract. Hopefully you can work on this and solve it :)
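For reference, the general argument hinted at above can be sketched like this (a sketch, not a full write-up; $\xi$ and $\eta$ are the intermediate points from Taylor's theorem with Lagrange remainder):

```latex
f(0) = f(x) - x\,f'(x) + \frac{f''(\xi)}{2}\,x^2, \qquad
f(1) = f(x) + (1-x)\,f'(x) + \frac{f''(\eta)}{2}\,(1-x)^2 .
```

Subtracting and using $f(0) = f(1)$ gives $0 = f'(x) + \tfrac{1}{2}\bigl[f''(\eta)(1-x)^2 - f''(\xi)x^2\bigr]$, hence $|f'(x)| \le \tfrac{A}{2}\bigl[x^2 + (1-x)^2\bigr] \le \tfrac{A}{2}$, since $x^2 + (1-x)^2 = 1 - 2x(1-x) \le 1$ on $[0,1]$.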