While I was studying Taylor series, I ran into a problem: how can I prove that $\sum_{k=0}^{\infty}{f^{(k)}(0)\frac{x^k}{k!}}=f(x)$?
It wasn't clear to me why they "must" be equal, though I had the feeling that they would be. As far as I know, we first 'let' $f(x)=\sum_{k=0}^{\infty}{c_k(x-a)^k}$. I can't understand why this 'letting' is valid.
P.S. I'm doing this in order to make Euler's proof of the Basel problem more rigorous.
Since $\sin(x)=\sum_{n=1}^{\infty}{\frac{(-1)^{n+1} x^{2n-1}}{(2n-1)!}}$, and $\sin(x)=0$ has its roots exactly at $x=n\pi$ ($n\in\mathbb{Z}$), we can say that $$\sin(x)=\sum_{n=1}^{\infty}{\frac{(-1)^{n+1} x^{2n-1}}{(2n-1)!}}=x\prod_{n=1}^{\infty}{\left(1-\frac{x^2}{(\pi n)^2}\right)}$$ Comparing the coefficients of $x^3$ on both sides then gives $$-\frac{1}{6}=-\frac{1}{\pi^2}\sum_{n=1}^{\infty}{\frac{1}{n^2}},$$ i.e. $\sum_{n=1}^{\infty}\frac{1}{n^2}=\frac{\pi^2}{6}$.
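As a numerical sanity check (not a proof, of course), one can verify both that the truncated Euler product approaches $\sin x$ and that the partial sums of $\sum 1/n^2$ approach $\pi^2/6$. A quick sketch in Python (function name and truncation level are my own choices):

```python
import math

def sin_product(x, terms=100_000):
    """Truncated Euler product: x * prod_{n=1}^{terms} (1 - x^2 / (pi n)^2)."""
    p = x
    for n in range(1, terms + 1):
        p *= 1 - x**2 / (math.pi * n) ** 2
    return p

# The truncated product is close to sin(x)...
print(abs(sin_product(1.0) - math.sin(1.0)))  # small

# ...and the partial sum of 1/n^2 is close to pi^2/6.
basel = sum(1 / n**2 for n in range(1, 100_001))
print(abs(basel - math.pi**2 / 6))  # small
```

The product converges rather slowly (the tail contributes a relative error on the order of $x^2/(\pi^2 N)$), which is why so many factors are needed for even modest accuracy.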
I am having trouble justifying that $\sin(x)=\sum_{n=1}^{\infty}{\frac{(-1)^{n+1} x^{2n-1}}{(2n-1)!}}=x\prod_{n=1}^{\infty}{\left(1-\frac{x^2}{(\pi n)^2}\right)}$ holds, and in this question I'm asking about the first equality. (I've already proved that $\sin x=0$ has no non-real roots.)
In general, the Taylor series of a function need not be equal to the function itself, even when all the conditions necessary for the Taylor series to be defined are satisfied. To generate trivial examples, take any function that is equal to its Taylor series and arbitrarily change its value somewhere away from $0$ (or whichever other point you're taking the Taylor series at).
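There are also smooth counterexamples (not mentioned above, but standard): $f(x)=e^{-1/x^2}$ for $x\neq 0$ with $f(0)=0$ has $f^{(k)}(0)=0$ for every $k$, so its Taylor series at $0$ is identically zero even though $f$ is not. The key fact is that $f$ decays faster than any power of $x$ near $0$, which a quick Python check illustrates:

```python
import math

def f(x):
    """Smooth but non-analytic at 0: exp(-1/x^2) for x != 0, and 0 at x = 0."""
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# f(x)/x^k stays tiny near 0 for every k; this is what forces f^{(k)}(0) = 0
# for all k, so the Taylor series of f at 0 is the zero series.
for k in (1, 5, 10):
    print(k, f(0.1) / 0.1**k)  # all extremely small
```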
So for the thing you're trying to prove to even be true, you need to assume the function comes from a particular class of functions for which it is true. The proof will then depend on what that class of functions is.
In the case of the most obvious choice, analytic functions, the proof is trivial: by definition an analytic function can be written as a power series, so all you have to show is that this power series is the Taylor series.
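For completeness, that last step is a standard computation: inside its interval of convergence a power series may be differentiated term by term, so if $f(x)=\sum_{k=0}^{\infty}c_k(x-a)^k$ there, then $$f^{(m)}(x)=\sum_{k=m}^{\infty}\frac{k!}{(k-m)!}\,c_k\,(x-a)^{k-m}.$$ Setting $x=a$ kills every term except $k=m$, leaving $f^{(m)}(a)=m!\,c_m$, i.e. $c_m=\frac{f^{(m)}(a)}{m!}$, so the coefficients are exactly the Taylor coefficients.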