Prove the following inequality involving a sum.


Suppose that $m, n, q \in \mathbb{N}$, and define

$$\lambda_{n,m}=\frac{m+1/2}{(m+1/2)^2-n^2} \quad\text{and}\quad \sigma_{q,m} = \sum_{k=0}^q \lambda_{k,m}.$$ Furthermore, we are given that $$\frac{\lambda_{0,m}}{2}+\sum_{n=1}^{\infty}\lambda_{n,m}=0.$$
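The identity above is taken as given in the question. As a quick numerical illustration (a Python sketch of my own, not part of the original; the helper names `lam` and `identity_partial` are mine), truncating the series at $N$ terms leaves a remainder of order $(m+1/2)/N$, since $\lambda_{n,m}\sim -(m+1/2)/n^2$:

```python
def lam(n, m):
    """lambda_{n,m} = (m + 1/2) / ((m + 1/2)^2 - n^2)."""
    x = m + 0.5
    return x / (x * x - n * n)

def identity_partial(m, N):
    """lambda_{0,m}/2 + sum_{n=1}^{N} lambda_{n,m}; should approach 0 as N grows."""
    return lam(0, m) / 2 + sum(lam(n, m) for n in range(1, N + 1))

for m in (0, 1, 5):
    for N in (10**3, 10**5):
        print(m, N, identity_partial(m, N))  # shrinks roughly like (m + 0.5) / N
```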

I want to show that, for $q > m$,

$$2 \sigma_{q,m} \geq \lambda_{0,m} = \frac{1}{m+1/2}.$$

From the given identity, $\sum_{n=1}^{\infty}\lambda_{n,m} = -\frac{\lambda_{0,m}}{2}$, so $$\sigma_{q,m} = \lambda_{0,m} + \sum_{n=1}^{q}\lambda_{n,m} \longrightarrow \lambda_{0,m} - \frac{\lambda_{0,m}}{2} = \frac{\lambda_{0,m}}{2} \quad\text{as } q\to\infty.$$

Next, we observe that for $q \geq m$ we have $$\sigma_{q+1,m}-\sigma_{q,m} = \lambda_{q+1,m} = \frac{m+1/2}{(m+1/2)^2-(q+1)^2}<0,$$ since $q+1 \geq m+1 > m+1/2$. Hence $\sigma_{q,m}$ is strictly decreasing from $q = m$ onward. How do I proceed after this step?
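As a sanity check on the sign of these increments, here is a small Python sketch (the helpers `lam` and `sigma` are my own naming, not from the question):

```python
def lam(n, m):
    """lambda_{n,m} = (m + 1/2) / ((m + 1/2)^2 - n^2)."""
    x = m + 0.5
    return x / (x * x - n * n)

def sigma(q, m):
    """sigma_{q,m} = sum_{k=0}^{q} lambda_{k,m}."""
    return sum(lam(k, m) for k in range(q + 1))

m = 3
for q in range(m, m + 5):
    diff = sigma(q + 1, m) - sigma(q, m)  # equals lam(q + 1, m)
    print(q, diff, diff < 0)              # negative for every q >= m
```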

There is 1 answer below.

As $\sigma_{q,m}$ is strictly decreasing for $q \geq m$, we have $\sigma_{m,m} > \sigma_{m+1,m} > \sigma_{m+2,m} > \cdots$, and this tail converges to $\frac{\lambda_{0,m}}{2}$. A strictly decreasing sequence stays strictly above its limit, so $\sigma_{q,m} > \frac{\lambda_{0,m}}{2}$ for every $q > m$. Multiplying by $2$ gives $2\sigma_{q,m} > \lambda_{0,m} = \frac{1}{m+1/2}$, which is even stronger than the claimed inequality.
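To see the conclusion numerically, a minimal Python sketch (reusing the same hypothetical helpers as above) confirms that $2\sigma_{q,m}$ stays above $\lambda_{0,m}$ and approaches it from above as $q$ grows:

```python
def lam(n, m):
    x = m + 0.5
    return x / (x * x - n * n)

def sigma(q, m):
    return sum(lam(k, m) for k in range(q + 1))

m = 2
for q in (m + 1, 10, 100, 10_000):
    lhs, rhs = 2 * sigma(q, m), lam(0, m)  # lam(0, m) == 1 / (m + 0.5)
    print(q, lhs, rhs, lhs > rhs)          # always True; lhs decreases toward rhs
```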