Assuming the expansion for $(1+x)^{-1}$ prove that $\int_0^1 \frac{x \mathrm d x}{1+x}=\frac12-\frac13+\frac14-\frac15+\cdots$


Given: $(1+x)^{-1}=1-x+x^2-x^3+\cdots$ for $-1<x<1$, prove that

$\int_0^1 \frac{x \mathrm d x}{1+x}=\frac12-\frac13+\frac14-\frac15+\cdots$

My attempt: I multiplied both sides of $(1+x)^{-1}=1-x+x^2-x^3+\cdots$ for $-1<x<1$ by $x$ to get:

$\frac{x}{1+x}=x-x^2+x^3-x^4+\cdots$ for $-1<x<1$, and then I integrated term by term over $(0,1)$ to get the result.

But is such a method valid — the multiplication step, I mean? If so, why? I have solved three similar problems by this method, but I do not understand the reasoning behind it.
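As a sanity check on the identity itself (not a proof), one can compare partial sums of the alternating series against the closed form of the integral, $\int_0^1 \frac{x\,\mathrm dx}{1+x}=1-\ln 2$. A minimal sketch, where `partial_sum` is a hypothetical helper name:

```python
from math import log

def partial_sum(n):
    """Sum of the first n terms of 1/2 - 1/3 + 1/4 - 1/5 + ..."""
    return sum((-1) ** k / (k + 2) for k in range(n))

exact = 1 - log(2)               # closed form of the integral, ~0.30685
approx = partial_sum(100_000)    # alternating-series error < 1/100002
print(approx, exact)
```

The alternating-series error bound guarantees the partial sum is within one omitted term of the limit, so the two printed values agree to about five decimal places.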

Best answer:

Yes, this is a valid operation. Inside the radius of convergence a power series converges absolutely, so multiplying it term by term by $x$ (or by any polynomial) produces a series that converges to the product. To show that the integrated series still converges to the right number at the endpoint $x=1$, you can examine the remainder of the finite sum. Operations like this are always valid inside the radius of convergence, though sometimes they move the interval of convergence (for example, substituting $z-4$ for $x$ shifts the interval of convergence to $3<z<5$).
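The remainder argument alluded to above can be written out explicitly using the finite geometric sum, which is an exact identity on all of $[0,1]$ and so avoids any endpoint issue:

$$\frac{x}{1+x}=x-x^2+x^3-\cdots+(-1)^{n-1}x^n+(-1)^n\frac{x^{n+1}}{1+x}.$$

Integrating over $[0,1]$,

$$\int_0^1\frac{x\,\mathrm dx}{1+x}=\frac12-\frac13+\frac14-\cdots+\frac{(-1)^{n-1}}{n+1}+(-1)^n\int_0^1\frac{x^{n+1}}{1+x}\,\mathrm dx,$$

and the remainder term vanishes as $n\to\infty$ because

$$0\le\int_0^1\frac{x^{n+1}}{1+x}\,\mathrm dx\le\int_0^1 x^{n+1}\,\mathrm dx=\frac1{n+2}\to 0.$$

This justifies the term-by-term integration without invoking any general theorem about power series on the boundary of convergence.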