Why does this method of assigning a value to a divergent series fail?


I was trying to figure out a way to assign values to divergent sums and found this interesting identity. Here are the steps I took to get it:

$$\frac{1}{1-x} = \sum_{n=0}^\infty x^n$$ $$\frac{1}{1-e} = \sum_{n=0}^\infty e^n$$ $$\frac{1}{1-e} = \sum_{n=0}^\infty \sum_{k=0}^\infty \frac{n^k}{k!}$$ (I believe this next step is illegal, since you can't always exchange infinite sums:) $$\frac{1}{1-e} = \sum_{k=0}^\infty \frac{1}{k!} \sum_{n=1}^\infty n^k$$ $$\frac{1}{1-e} = \sum_{k=0}^\infty \frac{\zeta(-k)}{k!}$$
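As a sanity check, the last series can actually be summed numerically using the standard special values $\zeta(-k) = -\frac{B_{k+1}}{k+1}$, where $B_m$ are the Bernoulli numbers in the $B_1 = +\tfrac12$ convention. A short sketch in Python (the helper `bernoulli` is my own, not a library function):

```python
from fractions import Fraction
from math import e, factorial

def bernoulli(m):
    """B_0..B_m in the B_1 = +1/2 convention, via sum_{j<=n} C(n+1,j) B_j = 0."""
    B = [Fraction(1)] + [Fraction(0)] * m
    for n in range(1, m + 1):
        s, c = Fraction(0), 1              # c runs through C(n+1, j)
        for j in range(n):
            s += c * B[j]
            c = c * (n + 1 - j) // (j + 1)
        B[n] = -s / (n + 1)
    B[1] = Fraction(1, 2)                  # flip to the B_1 = +1/2 convention
    return B

N = 40
B = bernoulli(N + 1)
zeta_neg = [-B[k + 1] / (k + 1) for k in range(N + 1)]   # zeta(-k), exact

total = sum(float(zeta_neg[k]) / factorial(k) for k in range(N + 1))
print(total, 1 / (1 - e))   # both come out to about -0.5819767
```

The terms decay roughly like $2/(2\pi)^{k+1}$, so the series genuinely converges, and it converges to $\frac{1}{1-e}$.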

For $x=-e$ the process is similar, except that $$\frac{1}{1+e} = \sum_{k=0}^\infty \frac{1}{k!} \sum_{n=0}^\infty (-1)^nn^k$$ Since the eta function is defined with a starting index of $1$, I separate the $n=0$ term (which contributes $1$ only at $k=0$): $$\frac{1}{1+e} - 1 = \sum_{k=0}^\infty \frac{1}{k!} \sum_{n=1}^\infty (-1)^nn^k$$ and, since $\sum_{n=1}^\infty (-1)^n n^k = -\eta(-k)$, $$\frac{e}{1+e} = \sum_{k=0}^\infty \frac{\eta(-k)}{k!}$$
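The eta version can be checked numerically the same way using $\eta(-k) = (1-2^{k+1})\,\zeta(-k)$, which follows from $\eta(s) = (1-2^{1-s})\zeta(s)$. Summing from $k=0$, the series comes out to $\frac{e}{1+e} = 1 - \frac{1}{1+e}$. A sketch (again `bernoulli` is my own helper, same convention as above):

```python
from fractions import Fraction
from math import e, factorial

def bernoulli(m):
    """B_0..B_m in the B_1 = +1/2 convention."""
    B = [Fraction(1)] + [Fraction(0)] * m
    for n in range(1, m + 1):
        s, c = Fraction(0), 1
        for j in range(n):
            s += c * B[j]
            c = c * (n + 1 - j) // (j + 1)
        B[n] = -s / (n + 1)
    B[1] = Fraction(1, 2)
    return B

N = 40
B = bernoulli(N + 1)
# eta(-k) = (1 - 2^(k+1)) * zeta(-k), with zeta(-k) = -B_{k+1}/(k+1)
eta_neg = [(1 - 2 ** (k + 1)) * (-B[k + 1] / (k + 1)) for k in range(N + 1)]

total = sum(float(eta_neg[k]) / factorial(k) for k in range(N + 1))
print(total, e / (1 + e))   # both come out to about 0.7310586
```

Here the terms decay like $2/\pi^{k+1}$ (the $2^{k+1}$ factor halves the decay rate compared with the zeta case), so convergence is again genuine.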

But if $x$ is neither of these values, the sum doesn't seem to converge properly. $$\frac{1}{1-x} = \sum_{n=0}^\infty x^n$$ $$\frac{1}{1-x} = \sum_{n=0}^\infty e^{\ln(x)n}$$ $$\frac{1}{1-x} = \sum_{n=0}^\infty \sum_{k=0}^\infty \frac{\left(\ln(x)n\right)^k}{k!}$$ Exchanging as before, $$\frac{1}{1-x} = \sum_{k=0}^\infty \frac{\ln(x)^k}{k!} \zeta(-k) $$ doesn't work, and neither does $$\frac{1}{1-x} = \sum_{k=0}^\infty \frac{1}{k!} \zeta(-\ln(x)k) $$
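The failure can be seen concretely at, say, $x = e^2$ (so $\ln x = 2$): the series $\sum_k \zeta(-k)\,\ln(x)^k/k!$ still converges, since the terms decay roughly like $(\ln x/2\pi)^k$, but to the wrong value. Numerically the sum appears to equal $\frac{1}{1-x} + \frac{1}{\ln x} - 1$, an offset that vanishes precisely at $x = e$, which would explain the coincidence. A sketch with the same conventions as above:

```python
from fractions import Fraction
from math import e, factorial, log

def bernoulli(m):
    """B_0..B_m in the B_1 = +1/2 convention."""
    B = [Fraction(1)] + [Fraction(0)] * m
    for n in range(1, m + 1):
        s, c = Fraction(0), 1
        for j in range(n):
            s += c * B[j]
            c = c * (n + 1 - j) // (j + 1)
        B[n] = -s / (n + 1)
    B[1] = Fraction(1, 2)
    return B

N = 50
B = bernoulli(N + 1)
zeta_neg = [-B[k + 1] / (k + 1) for k in range(N + 1)]

x = e ** 2
t = log(x)                                   # t = 2 < 2*pi, so the series converges
total = sum(float(zeta_neg[k]) * t ** k / factorial(k) for k in range(N + 1))
print(total, 1 / (1 - x))                    # these do NOT agree
print(total - (1 / (1 - x) + 1 / t - 1))     # ~ 0: the offset is 1/ln(x) - 1
```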

On the other hand, it seems that $$1-\frac{1}{1+x} = \sum_{k=0}^\infty \frac{\ln(x)^k}{k!} \eta(-k) $$ does work even when $x>e$.
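Indeed, the eta version checks out numerically for $x$ beyond $e$, e.g. $x=4$. It seems to keep working as long as $|\ln x| < \pi$, which would put the breakdown near $x = e^{\pi} \approx 23.1$; that bound is my own numerical observation, not something established above. A sketch with the same conventions:

```python
from fractions import Fraction
from math import factorial, log

def bernoulli(m):
    """B_0..B_m in the B_1 = +1/2 convention."""
    B = [Fraction(1)] + [Fraction(0)] * m
    for n in range(1, m + 1):
        s, c = Fraction(0), 1
        for j in range(n):
            s += c * B[j]
            c = c * (n + 1 - j) // (j + 1)
        B[n] = -s / (n + 1)
    B[1] = Fraction(1, 2)
    return B

N = 60
B = bernoulli(N + 1)
eta_neg = [(1 - 2 ** (k + 1)) * (-B[k + 1] / (k + 1)) for k in range(N + 1)]

x = 4.0
t = log(x)                                   # ln 4 ~ 1.386 < pi
total = sum(float(eta_neg[k]) * t ** k / factorial(k) for k in range(N + 1))
print(total, 1 - 1 / (1 + x))                # both come out to 0.8
```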

I'm curious what causes the sum to converge properly in some cases, and what causes it to fail in others. How would I go about proving exactly where this sum converges to the right value?