Convergence in distribution and integration


Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of real-valued random variables converging in distribution to $X$. Then, for any bounded Lipschitz function $f$, we have $\lim_{n\rightarrow\infty}E[f(X_n)]=E[f(X)]$.
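As a quick numerical illustration of this convergence (a sketch of my own; the test function $f$ and the choice of $X_n$ are mine, not from the question): take $X_n$ uniform on $\{0, 1/n, \dots, (n-1)/n\}$, which converges in distribution to $U\sim\mathrm{Uniform}(0,1)$, and the bounded Lipschitz function $f(x)=1/(1+x^2)$, for which $E[f(U)]=\arctan(1)=\pi/4$.

```python
import math

def f(x):
    # Bounded Lipschitz test function: 0 < f <= 1 on R
    return 1.0 / (1.0 + x * x)

def Ef_Xn(n):
    # E[f(X_n)] for X_n uniform on {0, 1/n, ..., (n-1)/n};
    # this is a left Riemann sum of f over [0, 1]
    return sum(f(k / n) for k in range(n)) / n

target = math.pi / 4  # E[f(U)] for U ~ Uniform(0, 1)
for n in (10, 100, 10_000):
    print(n, Ef_Xn(n), abs(Ef_Xn(n) - target))
```

The error decays like $O(1/n)$, as expected for a left Riemann sum of a smooth decreasing integrand.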

Can we show that $$ \lim_{n\rightarrow\infty} \int_0^\infty E[g(y,X_n)]\,dy = \int_0^\infty E[g(y,X)]\,dy, $$ where $g$ is a bounded Lipschitz function with $\int_0^\infty g(y,x)\,dy<\infty$ for every finite $x$?

If not, what additional conditions are needed?

Best answer:

Here is a counterexample (modified from my comments): take the deterministic case $X=0$, $X_n=1/n$ for $n \in \{1, 2, 3, \dots\}$.

It will be easier to define $g(x,y)$ over the domain $D = \{(x,y): x \in [0,1],\ y\geq 0\}$ (note the argument order is swapped relative to the question): $$ g(x,y)=xe^{-xy} \quad \forall (x,y) \in D.$$ Then: $$ \int_{0}^{\infty} g(x,y)\,dy = \left\{ \begin{array}{ll} 0 &\mbox{ if $x =0$} \\ 1 & \mbox{ if $x>0$.} \end{array} \right.$$ Thus $$ \int_0^{\infty} g(X,y)\,dy =0,$$ but $$ \int_0^{\infty} g(X_n, y)\,dy = 1 \quad \forall n \in \{1, 2, 3, \dots\}.$$ Note $g$ is bounded, since $0\leq g(x,y)\leq 1$ for all $(x,y)\in D$, and Lipschitz continuous (see the Note below). $\Box$
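The gap between the two integrals can be seen numerically (a minimal sketch of my own, using a plain trapezoidal rule; the truncation point and step count are arbitrary choices):

```python
import math

def g(x, y):
    # Counterexample kernel: g(x, y) = x * exp(-x*y) on [0,1] x [0, inf)
    return x * math.exp(-x * y)

def integral_g(x, upper=500.0, steps=200_000):
    # Trapezoidal approximation of the improper integral
    # int_0^inf g(x, y) dy, truncated at y = upper
    h = upper / steps
    total = 0.5 * (g(x, 0.0) + g(x, upper))
    total += sum(g(x, i * h) for i in range(1, steps))
    return total * h

print(integral_g(0.0))        # X = 0: integral is identically 0
for n in (1, 5, 10):          # X_n = 1/n: integral stays at 1 for every n
    print(n, integral_g(1.0 / n))
```

The integral of $g(X_n,\cdot)$ sticks at $1$ for every $n$ while the limit value is $0$, which is exactly the failure of interchange the answer describes.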


Note: To show $g$ is Lipschitz we note it is continuously differentiable and for all $(x,y) \in D$: $$[\partial g/\partial x; \partial g/\partial y] = [(1-xy)e^{-xy}; -x^2e^{-xy}] $$ and so \begin{align} ||[\partial g/\partial x; \partial g/\partial y]||^2&= (1-xy)^2e^{-2xy} + x^4e^{-2xy}\\ &\overset{(a)}{\leq} (1-xy)^2e^{-2xy} + e^{-2xy}\\ &\overset{(b)}{\leq} \sup_{t\geq 0} \left\{(1-t)^2e^{-2t} + e^{-2t}\right\}\\ &\overset{(c)}= 2 \end{align} where (a) holds because $x \in [0,1]$ so $x^4\leq 1$; (b) holds because $xy \geq 0$; (c) holds because the supremum is achieved at $t=0$.
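The bound $\|\nabla g\|^2 \leq 2$ derived above can be spot-checked on a grid (my own sketch; the grid resolution and the $y$-cutoff are arbitrary, chosen so the decaying exponential is negligible beyond it):

```python
import math

def grad_norm_sq(x, y):
    # ||grad g||^2 = (1 - x*y)^2 * e^{-2xy} + x^4 * e^{-2xy}
    e = math.exp(-2 * x * y)
    return (1 - x * y) ** 2 * e + x ** 4 * e

# Grid over D = [0,1] x [0, 50]; the tail y > 50 contributes values near 0
worst = max(grad_norm_sq(i / 200, j * 0.25)
            for i in range(201) for j in range(201))
print(worst)  # → 2.0, attained at (x, y) = (1, 0), matching sup = 2 at t = 0
```

Consistent with step (c), the maximum on the grid sits at $xy = 0$ with $x = 1$, where both terms equal $1$.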