How to sum an infinite divergent series whose terms are indexed from the end (infinity)


From my physical problem, I ended up with a sum that looks like the following.

$$ S(\omega) = \sum_{p = 1}^{N-1} p \exp{\left(-\frac{(N - p)^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)\left(N-p\right)\right)} $$

I want to know what the sum is as $N \to \infty$. Here $\omega$ is the point at which the sum is evaluated, and $\mu$ and $\sigma$ are constants. Can this be reduced to a closed-form expression (a function of the variables $\omega$, $\mu$ and $\sigma$)?
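To see the behaviour concretely, here is a minimal numerical sketch; the values $\mu = 1$, $\sigma = 0.5$ and $\omega = 0$ are arbitrary example choices, not taken from the problem:

```python
import math

def S(omega, N, mu=1.0, sigma=0.5):
    # Partial sum S_N(omega) exactly as written above
    # (mu and sigma are arbitrary example values)
    return sum(
        p * math.exp(-((N - p) ** 2) * sigma**2 / 2)
          * math.cos((mu - omega) * (N - p))
        for p in range(1, N)
    )

# The magnitude of the partial sums keeps growing with N
# instead of settling down to a limit:
for N in (10, 100, 1000):
    print(N, S(0.0, N))
```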

================ EDIT with respect to the accepted answer ==================

In my physical problem, I realized that I had to average over $N$, i.e. divide the sum by $N$. The new function (in the parametrization provided by the answer) looks like

$$ S_N(\omega) = \sum_{q = 1}^{N-1} \left(1 - \frac{q}{N}\right) \exp{\left(-\frac{q^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)q\right)} $$

Then, following the answer, we have,

$$ S_N(\omega) - S_{N - 1}(\omega) = \sum_{q = 1}^{N-1} q\left(\frac{1}{N-1} - \frac{1}{N}\right) \exp{\left(-\frac{q^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)q\right)} $$

I suppose this difference goes to $0$ as $N \to \infty$?
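Numerically this looks plausible. With the example values $\mu = 1$, $\sigma = 0.5$, $\omega = 0$ (arbitrary choices, not from the problem), the consecutive differences of the averaged sums shrink rapidly:

```python
import math

def S_avg(omega, N, mu=1.0, sigma=0.5):
    # Averaged partial sum S_N(omega) from the edit
    # (mu and sigma are arbitrary example values)
    return sum(
        (1 - q / N) * math.exp(-(q**2) * sigma**2 / 2)
                    * math.cos((mu - omega) * q)
        for q in range(1, N)
    )

# The consecutive differences shrink as N grows:
for N in (10, 100, 1000):
    print(N, S_avg(0.0, N) - S_avg(0.0, N - 1))
```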

On BEST ANSWER

Actually, this limit won't even exist most of the time!

Applying a change of variables $q = N-p$, and adding a subscript $S_N$ to show the dependence on $N$, we get: $$ S_N(\omega) = \sum_{q = 1}^{N-1} (N-q) \exp{\left(-\frac{q^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)q\right)} $$

Then we can compute $$ S_N(\omega) - S_{N-1}(\omega) = \sum_{q=1}^{N-1} \exp{\left(-\frac{q^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)q\right)} $$

For the limit of $S_N(\omega)$ to exist as $N \to \infty$, the differences $S_N(\omega) - S_{N-1}(\omega)$ must approach $0$, i.e. we need

$$ 0 = \sum_{q=1}^{\infty} \exp{\left(-\frac{q^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)q\right)}. $$

On the other hand, there's no way that series equals $0$ for all choices of $\omega$, $\sigma$, $\mu$. One easy counterexample is $\omega = \mu$: then every term of the series is positive, so the sum is strictly $> 0$. The point is that the limit of the $S_N$ functions is not, in general, defined for an arbitrary choice of $\omega$, $\sigma$ and $\mu$.
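A quick sketch of the counterexample (again with arbitrary example values $\mu = 1$, $\sigma = 0.5$): at $\omega = \mu$ the partial sums just keep growing.

```python
import math

def S_q(omega, N, mu=1.0, sigma=0.5):
    # S_N(omega) in the q = N - p parametrization
    # (mu and sigma are arbitrary example values)
    return sum(
        (N - q) * math.exp(-(q**2) * sigma**2 / 2)
               * math.cos((mu - omega) * q)
        for q in range(1, N)
    )

# At omega = mu every cosine equals 1, so all terms are positive
# and S_N grows without bound:
for N in (10, 100, 1000):
    print(N, S_q(1.0, N))
```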

Edit in response to the edited question

The same reasoning does show that $S_N(\omega) - S_{N-1}(\omega) \to 0$ as $N \to \infty$, but that's not quite strong enough to prove what you want. If the differences don't approach $0$, the sequence has no chance of converging; but even if they do approach $0$, the sequence may still fail to converge (the partial sums of the harmonic series are the classic example: their differences $1/N \to 0$, yet the sums diverge).

Anyway, the new functions do in fact converge. Each term is bounded in absolute value by $\exp\left(-\frac{q^2\sigma^2}{2}\right)$, and $\sum_{q=1}^\infty \exp\left(-\frac{q^2\sigma^2}{2}\right)$ converges; since each term also converges pointwise as $N \to \infty$ (the factor $1 - q/N \to 1$), Tannery's theorem (dominated convergence for series) gives $$ \lim_{N \to \infty} S_N(\omega) = \sum_{q=1}^{\infty} \exp{\left(-\frac{q^2\sigma^2}{2}\right)} \cos{\left(\left(\mu - \omega\right)q\right)}. $$
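As a numerical sanity check (with the arbitrary example values $\mu = 1$, $\sigma = 0.5$, $\omega = 0$), the averaged sums do approach the plain series $\sum_{q=1}^\infty \exp\left(-\frac{q^2\sigma^2}{2}\right)\cos\left(\left(\mu-\omega\right)q\right)$:

```python
import math

def term(q, omega, mu=1.0, sigma=0.5):
    # One term of the series (mu and sigma are arbitrary example values)
    return math.exp(-(q**2) * sigma**2 / 2) * math.cos((mu - omega) * q)

def S_avg(omega, N):
    # Averaged partial sum from the edited question
    return sum((1 - q / N) * term(q, omega) for q in range(1, N))

# Candidate limit: the plain series, truncated deep in the Gaussian tail,
# where the remaining terms are negligible
limit = sum(term(q, 0.0) for q in range(1, 50))

# The gap between S_N and the candidate limit shrinks as N grows:
for N in (10, 100, 1000):
    print(N, abs(S_avg(0.0, N) - limit))
```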

If you want to get into more detail than that, it would probably be best to post a separate follow-up question (and link back to this one for context). That keeps the site better organized, and it lets people easily see that the follow-up doesn't have an answer yet, even though your original question was answered.