The result looks obvious, but I am having difficulty proving it: Suppose $X$ is a non-negative real-valued discrete random variable with $P(X=a)=k>0$, where $a$ is the largest value such that $P(X=a)>0$.
I need to show that if the random variable is changed so that $P(X=b)=k$ for some $b>a$ and $P(X=a)=0$ (and this is the only change to $X$), then $\mathrm{Var}(X)$ becomes larger. (In other words, I increase one value in the support of $X$ and want to show that this increases the variance.)
The difficulty is that if I simply apply the formula for the variance, then after the change both $E(X^2)$ and $(E(X))^2$ increase, and I could not show that their difference becomes larger.
Edit: I forgot to state one key requirement. When I said $P(X=a)=k>0$ above, this $a$ is the largest value such that $P(X=a)>0$.
Thank you for any help.
If the distribution is on values $x_1,x_2,...$, then the effect (on the variance) of increasing one of these values (say $x_j$) can be found by looking at the partial derivative w.r.t. $x_j$: $$\begin{align}\frac{\partial}{\partial x_j}V(X) &= \frac{\partial}{\partial x_j} \left(\sum_i p_i x_i^2 - (\sum_i p_i x_i )^2 \right)\\ &= \sum_i p_i 2 x_i \delta_{ij} - 2(\sum_i p_i x_i)\sum_i p_i \delta_{ij} \\ &= 2p_jx_j - 2(\sum_i p_i x_i)p_j\\ &= 2p_j(x_j-E(X)) \end{align}$$ which is greater than $0$ iff $x_j > E(X)$.
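The derivative formula above is easy to verify numerically. Here is a quick sketch (my own example distribution, not from the answer): estimate $\partial V/\partial x_j$ with a central finite difference and compare it with $2p_j(x_j - E(X))$.

```python
# Finite-difference check of dV/dx_j = 2*p_j*(x_j - E[X]).
p = [0.2, 0.5, 0.3]          # probabilities (must sum to 1)
x = [1.0, 2.0, 5.0]          # support values

def var(p, x):
    """Variance of a discrete distribution: E[X^2] - (E[X])^2."""
    m = sum(pi * xi for pi, xi in zip(p, x))
    return sum(pi * xi * xi for pi, xi in zip(p, x)) - m * m

j, h = 2, 1e-6               # perturb x_j by +/- h
x_plus = x[:];  x_plus[j]  += h
x_minus = x[:]; x_minus[j] -= h
numeric = (var(p, x_plus) - var(p, x_minus)) / (2 * h)

mean = sum(pi * xi for pi, xi in zip(p, x))
analytic = 2 * p[j] * (x[j] - mean)
print(abs(numeric - analytic) < 1e-6)
```

Since $V$ is a quadratic polynomial in $x_j$, the central difference agrees with the exact derivative up to floating-point rounding.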
If you're not familiar with partial differentiation, you can do this directly by arithmetic. Let $X$ be the original r.v., and $X'$ the result of changing $x_j\rightarrow x_j+\delta$; then we have the following:
$$\begin{align}V(X') &= \sum_{i\ne j}p_i x_i^2 + p_j(x_j+\delta)^2 - \left(\sum_{i\ne j} p_i x_i + p_j(x_j+\delta) \right)^2\\ &=\left( \sum_{i\ne j}p_i x_i^2 + p_j x_j^2\right) + p_j(2x_j+\delta)\delta - \left( ( \sum_{i\ne j} p_i x_i + p_j x_j ) + p_j\delta\right)^2\\ &= \sum_i p_i x_i^2 + p_j(2x_j+\delta)\delta - \left(\sum_i p_i x_i \right)^2 - 2(\sum_i p_i x_i)p_j\delta - (p_j\delta)^2\\ &= V(X) + 2p_j \delta x_j - 2p_j \delta\sum_i p_i x_i + p_j(1-p_j)\delta^2\\ &=V(X) + 2p_j \delta\left(x_j - E(X) \right)+ p_j(1-p_j)\delta^2 \end{align} $$ so the variance increases for every $\delta>0$ whenever $x_j \ge E(X)$ and $p_j<1$: the $\delta^2$ term is then strictly positive, and the linear term is nonnegative. In particular, since in your problem $x_j=a$ is the largest support point, $a \ge E(X)$, so moving that mass up to $b=a+\delta$ increases the variance (unless $X$ is constant, in which case $p_j=1$ and both variances are $0$).
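The exact identity in the last line can also be checked numerically. A minimal sketch, using example values of my own choosing (the distribution and shift $\delta$ are assumptions, not from the answer):

```python
# Check: V(X') = V(X) + 2*p_j*d*(x_j - E[X]) + p_j*(1-p_j)*d**2,
# where X' is X with its j-th support value shifted up by d.
p = [0.3, 0.3, 0.4]
x = [0.0, 1.0, 4.0]          # x[2] = 4 is the largest support point

def var(p, x):
    """Variance of a discrete distribution: E[X^2] - (E[X])^2."""
    m = sum(pi * xi for pi, xi in zip(p, x))
    return sum(pi * xi ** 2 for pi, xi in zip(p, x)) - m ** 2

j, d = 2, 1.5                 # shift the top value upward by d > 0
x_new = x[:]; x_new[j] += d

mean = sum(pi * xi for pi, xi in zip(p, x))
lhs = var(p, x_new)
rhs = var(p, x) + 2 * p[j] * d * (x[j] - mean) + p[j] * (1 - p[j]) * d ** 2
print(abs(lhs - rhs) < 1e-9, lhs > var(p, x))
```

Because $x_j=4$ exceeds the mean $E(X)=1.9$, both correction terms are positive and the variance strictly increases.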