Covariance between functions of random variables


Suppose $Y,X_1,X_2,\dots$ are random variables (not necessarily i.i.d.) with expectation $0$ and variance $1$, and suppose that $$Cov(X_n,Y)<\frac{1}{n} $$ for every $n \in \mathbb{N}$. Given a measurable function $f:\mathbb{R}\longrightarrow \mathbb{R}$ with $\|f\|_{\infty}\le 1$, can we bound $$Cov(f(X_n),Y) $$ by anything better than the trivial Cauchy–Schwarz bound $$|Cov(f(X_n),Y)|\le \sqrt{Var(f(X_n))\cdot Var(Y)}\,? $$


In general I think not. Consider $X$ and $Y$ whose joint distribution is given by the following table, where $x+y+z=1$, $0<a<b$, and $r$ is the desired covariance between $X$ and $Y$:

\begin{array}{l l | c c c } &&&Y& \\ \hline & P(X=x,Y=y) & -c & 0 & c \\ \hline & b & 0 & x/2 & 0 \\ \hline X & a & z/2 & 0 & y/2 \\ \hline & -a & y/2 & 0 & z/2 \\ \hline & -b & 0 & x/2 & 0 \\ \end{array}

Take $z=0$, $x=\epsilon$, and $y=1-\epsilon$. It is straightforward to check that with $c=\sqrt{1/(1-\epsilon)}$ and $a=r/(cy)$ we get $Cov(X,Y)=r$ and $Var(Y)=1$, while $Var(X)=1$ is achieved by choosing $b$ appropriately (namely so that $b^2x+a^2(y+z)=1$). Let $f(x)=x/a$ for $x\in [-a,a]$, and $f(x)=0$ otherwise, so $\|f\|_\infty=1$. It follows that $Cov(f(X),Y)=c(y-z)=(1-\epsilon)\sqrt{1/(1-\epsilon)}=\sqrt{1-\epsilon}$, which does not depend on $r$ at all and can be made arbitrarily close to $1$. By Cauchy–Schwarz, $1$ is clearly the upper bound in this example, so no improvement over the trivial bound is possible.
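The construction can be checked numerically. Below is a quick sketch in Python; the specific values `eps = 0.01` and `r = 0.001` are my own illustrative choices, not part of the original answer.

```python
# Numerical check of the counterexample: Cov(X, Y) = r is tiny,
# yet Cov(f(X), Y) is close to 1.  (eps and r are illustrative choices.)
eps = 0.01           # x = eps, y = 1 - eps, z = 0
r   = 0.001          # target Cov(X, Y); can be made arbitrarily small

x, y, z = eps, 1.0 - eps, 0.0
c = (1.0 / (1.0 - eps)) ** 0.5    # makes Var(Y) = 1
a = r / (c * y)                   # makes Cov(X, Y) = r

# Choose b so that Var(X) = b^2 * x + a^2 * (y + z) = 1.
b = ((1.0 - a * a * (y + z)) / x) ** 0.5
assert a < b

# Joint distribution from the table: (x_value, y_value, probability).
joint = [
    ( b, 0.0, x / 2), (-b, 0.0, x / 2),
    ( a,  -c, z / 2), ( a,   c, y / 2),
    (-a,  -c, y / 2), (-a,   c, z / 2),
]

def f(t):
    # f(t) = t/a on [-a, a], 0 otherwise, so ||f||_inf = 1.
    return t / a if abs(t) <= a else 0.0

def E(g):
    # Expectation of g(X, Y) under the joint distribution.
    return sum(p * g(xv, yv) for xv, yv, p in joint)

# Both marginals have mean 0, so covariances are plain expectations.
print(E(lambda xv, yv: xv * yv))       # Cov(X, Y)    ~ r = 0.001
print(E(lambda xv, yv: xv * xv))       # Var(X)       ~ 1
print(E(lambda xv, yv: yv * yv))       # Var(Y)       ~ 1
print(E(lambda xv, yv: f(xv) * yv))    # Cov(f(X), Y) ~ sqrt(1 - eps)
```

With `eps = 0.01`, the last line prints roughly $0.995 \approx \sqrt{0.99}$, regardless of how small `r` is taken.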