I would like to know why convergence in distribution doesn't imply convergence in probability EXCEPT if the limit is constant.
I can easily see examples of where this holds, and can show that if the limit is constant then the convergences are equivalent.
How do you show that a constant limit is the only case where the two modes of convergence are equivalent? (Intuitively I believe this is true.)
Thanks
I think you have this backwards: convergence in probability does in fact imply convergence in distribution. The two notions fail to be equivalent because the converse is false; convergence in distribution does not imply convergence in probability.

For a counterexample, let $A\subset\Omega$ be an event with probability $1/2$, and set $X_n = I_A$ for $n$ odd and $X_n = 1-I_A$ for $n$ even, where $I_A$ is the indicator of $A$: the function on $\Omega$ that equals $1$ for $\omega\in A$ and $0$ otherwise. These converge in distribution, trivially, since they all have the same distribution, namely $F(x)=0$ if $x\lt 0$, $F(x) = 1/2$ if $0\le x\lt 1$, and $F(x)=1$ otherwise. But they do not converge in probability to anything: no matter how large $n$ gets, $|X_{n+1}-X_n|=1$ with probability $1$, so the sequence is not even Cauchy in probability, which any sequence converging in probability must be.
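A quick simulation makes the counterexample concrete (a sketch, not part of the argument above: I'm modeling $\Omega$ by a uniform draw and taking $A$ to be the event that the draw falls below $1/2$). The odd- and even-indexed variables have identical Bernoulli($1/2$) distributions, yet at every $\omega$ they sit at distance exactly $1$ from each other.

```python
import random

random.seed(0)

def sample_pair():
    """Draw one omega; return (X_odd(omega), X_even(omega)) where
    X_odd = I_A and X_even = 1 - I_A, with A = {omega : U(omega) < 1/2}."""
    indicator = 1 if random.random() < 0.5 else 0
    return indicator, 1 - indicator

N = 100_000
pairs = [sample_pair() for _ in range(N)]

# Same distribution: both empirical means are near 1/2 (Bernoulli(1/2)).
mean_odd = sum(x for x, _ in pairs) / N
mean_even = sum(y for _, y in pairs) / N

# No convergence in probability: |X_{n+1} - X_n| = 1 at every omega,
# so P(|X_{n+1} - X_n| > 1/2) = 1 for all n.
gap_always_one = all(abs(x - y) == 1 for x, y in pairs)

print(mean_odd, mean_even, gap_always_one)
```

The point the simulation displays is exactly the one in the answer: equality of distributions says nothing about the variables being close to each other as functions on $\Omega$.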
Note that, technically, even the special case you cite (convergence in distribution to a constant, i.e., for some constant $c$, $F(x)=0$ for $x\lt c$ and $F(x)=1$ otherwise) only implies convergence in probability when the $X_n$ are all defined on the same probability space $\Omega$, since the events $\{|X_n - c|\gt\varepsilon\}$ only make sense in that setting.
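For completeness, here is a sketch of the standard argument for that special case (you say you can already show this, so this is just for other readers). Suppose $X_n \to c$ in distribution, and fix $\varepsilon \gt 0$. Since $F$ jumps only at $c$, both $c-\varepsilon$ and $c+\varepsilon/2$ are continuity points of $F$, so
$$P(|X_n - c| \gt \varepsilon) \le F_n(c-\varepsilon) + 1 - F_n(c+\varepsilon/2) \longrightarrow F(c-\varepsilon) + 1 - F(c+\varepsilon/2) = 0 + 1 - 1 = 0,$$
which is exactly convergence in probability. Note how the argument uses the constancy of the limit twice: once to know every point other than $c$ is a continuity point, and once to evaluate $F$ on either side of $c$.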