Reasoning why the Taylor series converges to $\operatorname{Si}$


Let $\operatorname{Si}: \mathbb R \to \mathbb R, \operatorname{Si}(x) := \int^x_0 \frac{\sin t} t \, dt$

Problem: find the Taylor series of $\operatorname{Si}$ about $0$, and prove that for all $x \in \mathbb R$ the Taylor series converges to $\operatorname{Si}(x)$.

Steps taken:

The Taylor series is $\sum^\infty_{n=0}\frac{(-1)^{n}x^{2n+1}}{(2n+1)(2n+1)!}$, and I've proven via the ratio test that the series converges for every $x$, BUT how do I prove that it converges to $\operatorname{Si}(x)$?

Furthermore, I am confused about Taylor series. I have been told that I can only multiply two Taylor series, and treat the product as one Taylor series, if they are centered at the same point (e.g. at $0$). Why, then, am I able to multiply the known Taylor series of $\sin t$ about $0$ by the Taylor series of $\frac{1}{t}$ about $0$, which doesn't even exist? So how can I treat $\sin(t)\cdot\frac{1}{t}$ as one Taylor series about $0$?

Clarity on both topics would be greatly appreciated.


There are 2 best solutions below


Superficially this does not appear to involve complex variables, and for all I know there may be routine proofs involving only real variables. However, the answers that I know to these questions are ones that I learned by studying complex variables. With real variables, you can have a function $f$ that has derivatives of all orders at every point, and yet the Taylor series $$ \sum_{n=0}^\infty \frac{f^{(n)}(c)}{n!} (x-c)^n $$ converges to $f(x)$ only when $x=c.$ (The classic example is $f(x) = e^{-1/(x-c)^2}$ for $x \neq c$ and $f(c) = 0$, whose Taylor series at $c$ is identically zero.) Such a function cannot be extended to a differentiable function of a complex variable in an open neighborhood of $c.$
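As a small numerical illustration of this real-variables phenomenon (a sketch, not part of the argument; the name `f` is invented for this demo), take the standard flat function $f(x) = e^{-1/x^2}$, $f(0) = 0$, at $c = 0$: its value is visibly nonzero away from $0$, yet finite-difference estimates of its derivatives at $0$ are vanishingly small, consistent with a Taylor series that is identically zero.

```python
import math

# Flat function: f(x) = exp(-1/x^2) for x != 0, f(0) = 0.
# All derivatives at 0 vanish, so the Taylor series at 0 is 0,
# even though f(x) > 0 for every x != 0.
def f(x):
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# f is clearly nonzero away from 0 ...
for x in [0.5, 0.2, 0.1]:
    print(x, f(x))

# ... yet a central-difference estimate of f''(0) is astronomically small,
# hinting that every Taylor coefficient at 0 is zero.
h = 0.1
second_derivative_estimate = (f(h) - 2 * f(0.0) + f(-h)) / h**2
print(second_derivative_estimate)
```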

However, with complex variables, if a function $f$ has a first derivative at every point in some open neighborhood of $c,$ then it has derivatives of all orders at every point in that open neighborhood, the power series above converges to $f(x)$ for every $x$ within its disk of convergence, and the radius of convergence is at least as big as the radius of the largest disk centered at $c$ that is a subset of the aforementioned open neighborhood.

Since the Taylor series of $t\mapsto1/t$ centered at $t=0$ does not exist, the question of multiplying such series does not arise in this case. Note that $$ \frac{\sin t} t = \frac{t - \frac{t^3}6 + \frac{t^5}{120} - \cdots} t = 1 - \frac{t^2} 6 + \frac{t^4}{120} - \cdots. $$ The ratio test shows that this series has an infinite radius of convergence. Another fact from complex variables is that this implies the series converges pointwise to a function that is differentiable at every complex value of $t,$ and the derivative can be taken term by term. For $t \neq 0$ it clearly converges to $(\sin t)/t,$ but unlike $(\sin t)/t$ it is also defined at $t=0,$ where it equals $1.$
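As a quick numerical sanity check of this power series (a sketch only; `sinc_series` is a name invented for this demo, not a standard API), partial sums agree with $\sin(t)/t$ for $t \neq 0$ and give $1$ at $t = 0$:

```python
import math

# Partial sum of 1 - t^2/3! + t^4/5! - ..., the series obtained by
# dividing the sine series by t.
def sinc_series(t, n_terms=30):
    return sum((-1)**n * t**(2 * n) / math.factorial(2 * n + 1)
               for n in range(n_terms))

# Compare with sin(t)/t for t != 0; at t = 0 the series gives 1.
for t in [0.0, 0.5, 2.0, 10.0]:
    exact = math.sin(t) / t if t != 0 else 1.0
    print(t, sinc_series(t), exact)
```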

The derivative of your function $\operatorname{Si}$ is the function defined by this power series. Power series can also be antidifferentiated term by term within their region of convergence, so integrating this series term by term from $0$ to $x$ produces exactly the Taylor series you wrote down, and it converges to $\operatorname{Si}(x)$ for every $x.$


It's known that for every $t \in \mathbb{R}$

$$\sin t = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!} \cdot t^{2n+1}.$$

Dividing both sides by $t$, we get that for $t \neq 0$

$$\frac{\sin t}{t} = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!} \cdot t^{2n}.$$

The equality also holds for $t = 0$ if we assume (by the common convention) that $\frac{\sin t}{t} = 1$ for $t = 0$.

It's easy to check that the above series converges uniformly on $[0, x]$ for every $x > 0$ (and on $[x, 0]$ for every $x < 0$): by the Weierstrass $M$-test, the $n$-th term is bounded on $[0, x]$ by $\frac{x^{2n}}{(2n+1)!}$, and $\sum_{n} \frac{x^{2n}}{(2n+1)!}$ converges. Hence

$$\int \limits_0^x \frac{\sin t}{t} \mathrm{d} \, t = \sum_{n=0}^{\infty} \int \limits_0^x \frac{(-1)^n}{(2n+1)!} \cdot t^{2n} \mathrm{d} \, t = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1) \cdot (2n+1)!} \cdot x^{2n+1}$$
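As a numerical sanity check of this final identity (not a proof; the helper names `si_series` and `si_quad` are invented for this sketch), one can compare partial sums of the integrated series with a direct Simpson's-rule quadrature of $\int_0^x \frac{\sin t}{t}\,\mathrm{d}t$:

```python
import math

# Partial sum of the term-by-term integrated series for Si(x).
def si_series(x, n_terms=30):
    return sum((-1)**n * x**(2 * n + 1)
               / ((2 * n + 1) * math.factorial(2 * n + 1))
               for n in range(n_terms))

# Composite Simpson's rule for Si(x), using sin(t)/t := 1 at t = 0.
def si_quad(x, n=2000):  # n must be even
    f = lambda t: math.sin(t) / t if t != 0 else 1.0
    h = x / n
    s = (f(0.0) + f(x)
         + 4 * sum(f((2 * k - 1) * h) for k in range(1, n // 2 + 1))
         + 2 * sum(f(2 * k * h) for k in range(1, n // 2)))
    return s * h / 3

# The two computations agree to high accuracy.
for x in [0.5, 1.0, 5.0]:
    print(x, si_series(x), si_quad(x))
```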

and that's it.