Wikipedia says that Taylor's series is this:
$$f(x) = f(\mu) + \frac{f'(\mu)}{1!}(x-\mu) + \frac{f''(\mu)}{2!}(x-\mu)^2 + \frac{f'''(\mu)}{3!}(x-\mu)^3 + \cdots$$
$$f(x) = \sum \limits_{n=0}^{\infty} \frac{f^{(n)}(\mu)}{n!}(x-\mu)^n$$
However, in the proof, they suddenly switch from mu ($\mu$) to xi ($\xi$) in the second term.
Then, after that, they switch to yet another variable, zeta ($\zeta$), as in $g(\zeta) \ge 0$.
I don't get it: how is this Taylor's series? What are they doing here, and why is it valid?
The mu ($\mu$) is supposed to be the same in every term, according to Taylor's series.

They are using the Taylor series with remainder, not the infinite Taylor expansion. See the heading 'Explicit form of the remainder' at https://en.wikipedia.org/wiki/Taylor%27s_theorem
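Concretely, in the question's notation: truncating the series after the term of degree $k$ leaves a remainder, and the Lagrange form of that remainder is where the extra variable comes from:

$$f(x) = \sum_{n=0}^{k} \frac{f^{(n)}(\mu)}{n!}(x-\mu)^n + R_k(x), \qquad R_k(x) = \frac{f^{(k+1)}(\xi)}{(k+1)!}\,(x-\mu)^{k+1}$$

for some $\xi$ strictly between $\mu$ and $x$. The point $\xi$ is not chosen by the proof author; the theorem only guarantees that such a point exists, so the proof must carry it as a separate variable, distinct from the fixed expansion point $\mu$.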