Prove Jensen's inequality using Taylor's series: why is zeta different from mu?


Proving Jensen's inequality using Taylor's series

Wikipedia says that Taylor's series is this:

$$f(x) = f(\mu) + \frac{f'(\mu)}{1!}(x-\mu) + \frac{f''(\mu)}{2!}(x-\mu)^2 + \frac{f'''(\mu)}{3!}(x-\mu)^3 + \cdots$$

$$f(x) = \sum \limits_{n=0}^{\infty} \frac{f^{(n)}(\mu)}{n!}(x-\mu)^n$$

However, in the proof, they suddenly switch from mu ($\mu$) to xi ($\xi$) in the second term.

Then, after that, they suddenly switch to the variable zeta ($\zeta$), as in $g(\zeta) \ge 0$.

I don't get it... how is this Taylor's series? What are they doing here, and why is it valid?

The mu ($\mu$) is supposed to be the same for each term according to Taylor's series.


There are 2 best solutions below


They are using Taylor series with remainder, not the infinite Taylor expansion. See the heading 'Explicit form of the remainder' in: https://en.wikipedia.org/wiki/Taylor%27s_theorem

  • Taylor's Series ($n = \infty$)

    $f(x) = f(c) + \frac{f'(c)}{1!}(x-c) + \frac{f''(c)}{2!}(x-c)^2 + \frac{f'''(c)}{3!}(x-c)^3 + \cdots$

    $f(x) = \sum \limits_{i=0}^{\infty} \frac{f^{(i)}(c)}{i!}(x-c)^i$

  • Taylor's Series with remainder ($n = \text{finite number}$)

    Let $f$ and its derivatives $f', f'', \cdots, f^{(n)}$ exist and be continuous in a closed interval $a \le x \le b$, and suppose that $f^{(n+1)}$ exists in the open interval $a<x<b$, then for $c$ in $[a,b]$:

    $f(x) = \underbrace{\bigg[\sum \limits_{i=0}^{n} \frac{f^{(i)}(c)}{i!}(x-c)^i\bigg]}_{P_n(x)} + \underbrace{\bigg[ \frac{1}{(n+1)!}f^{(n+1)} (\xi)(x-c)^{n+1} \bigg]}_{R_n(x)}~~~~\text{for } c < \xi < x$

    $f(x) = P_n(x) + R_n(x)$

    where:

    • $P_n(x)$ is the Taylor's Polynomial:

    $P_n(x) = f(c) + f'(c)(x-c) + \cdots + \frac{1}{n!}f^{(n)}(c)(x-c)^n = \sum \limits_{i=0}^{n} \frac{f^{(i)}(c)}{i!}(x-c)^i$

    • $R_n(x)$ is the Lagrange Form Remainder:

      $R_n(x) = \frac{1}{(n+1)!}f^{(n+1)} (\xi)(x-c)^{n+1}~~~~\text{for } c < \xi < x$

  • Taylor's Series with remainder (n=1):

    Let $f$ and its derivative $f'$ exist and be continuous in a closed interval $a \le x \le b$, and suppose that $f''$ exists in the open interval $a<x<b$, then for $c$ in $[a,b]$:

    $f(x) = \underbrace{\Big[f(c) + f'(c)(x-c)\Big]}_{P_1(x)} + \underbrace{\Big[\frac{1}{2}f'' (\xi)(x-c)^2\Big]}_{R_1(x)}~~~~\text{for } c < \xi < x$
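To see the remainder form in action, here is a small numerical sketch (my own addition; the choices $f(x) = e^x$, $c = 0$, $x = 1$ are arbitrary illustrative assumptions). Since every derivative of $e^x$ is $e^x$, we can solve the $n = 1$ Lagrange remainder for $\xi$ explicitly and confirm it really lies between $c$ and $x$:

```python
import math

# Illustrative check of the n = 1 Lagrange remainder for f(x) = e^x
# about c = 0.  Here P_1(x) = 1 + x and
#   R_1(x) = e^x - (1 + x) = (1/2) f''(xi) x^2 = (1/2) e^xi x^2
# for some xi in (0, x).  Because f'' = e^x is invertible, we can solve
# for xi and check it falls strictly inside the interval.

x = 1.0
remainder = math.exp(x) - (1 + x)        # the actual R_1(x)
xi = math.log(2 * remainder / x**2)      # solve (1/2) e^xi x^2 = R_1(x)

print(xi)          # the mean-value point
print(0 < xi < x)  # True: xi lies strictly between c = 0 and x
```

The point is that $\xi$ is not a free parameter: it is whatever interior point makes the finite expansion exact, which is why it differs from the expansion point ($\mu$ or $c$).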


moving on:

For a twice-differentiable function, convexity is equivalent to the second derivative being nonnegative:

$f(x)$ is convex if and only if $f''(x) \ge 0$ for all $x$
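As a quick numerical sketch (my own illustration; the choice $f(x) = e^x$ is just an example of a convex function): the central second difference approximates $f''$ and should stay nonnegative across a grid.

```python
import math

# f(x) = e^x is convex: its second derivative e^x is always positive.
# Approximate f'' with a central second difference,
#   f''(x) ~ (f(x+h) - 2 f(x) + f(x-h)) / h**2,
# and check it is nonnegative at many sample points.

def f(x):
    return math.exp(x)

h = 1e-4
grid = [i / 10 for i in range(-50, 51)]  # points in [-5, 5]
second_diffs = [(f(x + h) - 2 * f(x) + f(x - h)) / h**2 for x in grid]

print(all(d >= 0 for d in second_diffs))  # True
```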

returning to this Taylor's series expansion:

$f(x) = \underbrace{\Big[f(c) + f'(c)(x-c)\Big]}_{P_1(x)} + \underbrace{\Big[\frac{1}{2}f'' (\xi)(x-c)^2\Big]}_{R_1(x)}~~~~\text{for } c < \xi < x$

since we already know that $f(x)$ is convex, we have $f''(\xi) \ge 0$, and $(x-c)^2 \ge 0$, so the remainder is nonnegative. that is:

$R_1(x) \ge 0$

Thus if we drop $R_1(x)$ from the equality:

$f(x) = P_1(x) + R_1(x)$

then we get the inequality:

$f(x) \ge P_1(x)$

$f(x) \ge f(c) + f'(c)(x-c)$

this holds for every $x$, so in particular we can substitute a random variable $X$ and take the expectation of each side (expectation preserves $\ge$ and is linear):

$E[f(X)] \ge E[f(c) + f'(c)(X-c)]$

$E[f(X)] \ge f(c) + f'(c)(E[X] - c)$

now let $c = E[X]$:

$E[f(X)] \ge f(E[X]) + f'(E[X])(E[X]-E[X])$

$E[f(X)] \ge f(E[X])$

thus proving Jensen's inequality for convex $f$.
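As a closing sanity check (my own addition, not part of the answer): a short Monte Carlo sketch with the convex function $f(x) = x^2$ and a uniform sample (both arbitrary illustrative choices), confirming the supporting-line inequality $f(x) \ge f(c) + f'(c)(x-c)$ and Jensen's inequality $E[f(X)] \ge f(E[X])$.

```python
import random

# Numerical sketch for the convex function f(x) = x**2, whose tangent at
# c is f(c) + f'(c)*(x - c).  For this particular f,
# E[f(X)] - f(E[X]) equals the sample variance, which is >= 0.

def f(x):
    return x * x

def fprime(x):
    return 2 * x

# 1) Supporting-line check: f(x) >= f(c) + f'(c)*(x - c) on a grid.
c = 1.5                                  # arbitrary base point
grid = [i / 10 for i in range(-100, 101)]
tangent_ok = all(f(x) >= f(c) + fprime(c) * (x - c) for x in grid)
print(tangent_ok)  # True: the parabola never dips below its tangent

# 2) Jensen check: E[f(X)] >= f(E[X]) on a random sample.
random.seed(0)
sample = [random.uniform(-1.0, 3.0) for _ in range(100_000)]
mean_of_f = sum(f(x) for x in sample) / len(sample)   # estimates E[f(X)]
f_of_mean = f(sum(sample) / len(sample))              # estimates f(E[X])
print(mean_of_f >= f_of_mean)  # True: variance is nonnegative
```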