Expectation of a transformed random variable

I'm trying to prove the following:

Let $X_n$ be a sequence of positive random variables and $g$ be a positive function. Suppose that $E[X_n]\to \infty$ as $n\to\infty$. If $E[g(X_n)]$ exists, then there is a positive, finite constant $c$ such that $$ \lim_{n\to\infty} \frac{E[g(X_n)]}{g(E[X_n])} = c. $$

I don't know whether this is true or not. I tested some well-known random variables and failed to find a counterexample.

Here is one example: I select $g(x) = 1/x$ and $X_n\sim Gamma(2,n/2)$, where $Gamma(k,\theta)$ denotes the Gamma random variable whose density function is $$ f(x)=\frac{1}{\Gamma(k)\theta^k}x^{k-1}e^{-x/\theta}. $$

After some manipulation, I obtain $E[X_n] = n$ and $E[1/X_n] = 2/n$, so the statement holds in this case with $c=2$.
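As a numerical sanity check of this Gamma example, here is a small sketch that approximates both expectations with a plain midpoint-rule integral (pure standard library; the step count and integration cutoff are arbitrary choices, not from the original post):

```python
import math

def gamma2_pdf(x, theta):
    # Density of Gamma(k=2, scale theta): x * exp(-x/theta) / theta^2.
    return x * math.exp(-x / theta) / theta**2

def expect(g, theta, steps=200_000):
    # Midpoint-rule approximation of E[g(X)] for X ~ Gamma(2, theta),
    # truncating the integral at 50*theta, where the tail is negligible.
    upper = 50 * theta
    h = upper / steps
    return sum(g(h * (i + 0.5)) * gamma2_pdf(h * (i + 0.5), theta) * h
               for i in range(steps))

for n in (4, 16, 64):
    theta = n / 2
    mean = expect(lambda x: x, theta)       # should be close to n
    inv = expect(lambda x: 1 / x, theta)    # should be close to 2/n
    print(n, round(mean, 3), round(inv, 5), round(inv / (1 / mean), 4))
```

The last column, $E[1/X_n]\,/\,(1/E[X_n])$, stays near $2$ for every $n$, matching the computation above.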

Best answer:

Another simple recipe for counterexamples: assuming $g$ is smooth enough to admit a Taylor expansion about $\mu=E[X]$, we get

$$E[g(X)]=g(\mu) + \frac{g''(\mu)}{2!}m_2+ \frac{g'''(\mu)}{3!} m_3+\cdots$$

where the $m_i=E[(X-\mu)^i]$ are the central moments (the first-order term vanishes because $m_1=0$). Then, by choosing $g(x)=e^x$, so that all derivatives of $g$ equal $g$, we get

$$\frac{E[g(X)]}{g(E[X])}=1 + \frac{m_2}{2!}+ \frac{m_3}{3!} +\cdots$$

For example, let $X_n$ be uniform on $[0,n]$ and $g(x)=e^x$: then $E[e^{X_n}]=(e^n-1)/n$ while $g(E[X_n])=e^{n/2}$, so the ratio $(e^n-1)/(n e^{n/2})\to\infty$.
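For $X_n$ uniform on $[0,n]$ both sides of the ratio are available in closed form, so a short script (standard library only) can display the blow-up directly:

```python
import math

def ratio(n):
    # For X_n uniform on [0, n] and g(x) = e^x:
    #   E[e^{X_n}] = (e^n - 1)/n   and   g(E[X_n]) = e^{n/2},
    # so the ratio is (e^n - 1) / (n * e^{n/2}), which grows like e^{n/2}/n.
    return (math.exp(n) - 1) / (n * math.exp(n / 2))

for n in (2, 10, 50, 200):
    print(f"n = {n:4d}: E[g(X)]/g(E[X]) = {ratio(n):.4g}")
```

The printed ratio increases without bound, so no finite limit $c$ exists for this choice of $X_n$ and $g$.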

Second answer:

The claim is false. Define $X_n$ to be $$ X_n=\begin{cases} \delta_n&\text{with probability $1-\epsilon_n$}\\ A_n&\text{with probability $\epsilon_n$} \end{cases} $$ where $\epsilon_n$ and $\delta_n$ are sequences of positive numbers tending to $0$, while $A_n$ is a sequence tending to $\infty$ such that $A_n\epsilon_n\to\infty$ (so that $E(X_n)\to\infty$). With appropriate choices of these sequences you can find $g(x)$ such that the limit in question is zero, or infinity.

Example: If $A_n=n^2$ and $\epsilon_n=\delta_n=\frac1n$, the limit is zero for $g(x):=x/(1+x)$; the limit is $\infty$ for $g(x):=x^2$.
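Since $X_n$ here takes only the two values $\delta_n$ and $A_n$, both expectations are finite sums, and a few lines of Python (standard library only; the helper name `ratio` is mine) can evaluate the example exactly:

```python
# X_n takes the value delta_n = 1/n with probability 1 - 1/n
# and the value A_n = n^2 with probability 1/n.
def ratio(g, n):
    eps = 1.0 / n
    mean = (1 - eps) * (1 / n) + eps * n**2       # E[X_n], roughly n
    eg = (1 - eps) * g(1 / n) + eps * g(n**2)     # E[g(X_n)]
    return eg / g(mean)

g_bounded = lambda x: x / (1 + x)   # claimed limit: 0
g_square = lambda x: x * x          # claimed limit: infinity
for n in (10, 100, 1000):
    print(n, ratio(g_bounded, n), ratio(g_square, n))
```

As $n$ grows, the bounded $g$ drives the ratio toward $0$ (roughly like $2/n$), while $g(x)=x^2$ sends it to $\infty$ (roughly like $n$), matching the two cases in the answer.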

The problem is that the requirement $E[g(X_n)]<\infty$ is trivially satisfied for these $X_n$; moreover, there are far too many admissible choices of $g$ for a single constant $c$ to work.