Covariance between random variables for identically distributed data


Consider the probability space $(\Omega, \mathcal A, P)$. Let $\{Z_i\}_{i\in\mathbb N}$ be a sequence of nonnegative random variables identically distributed as $Z$, possibly dependent. Moreover, define $f(x)=1_{[0,1]}(x)x$.

I want to show that $Cov(f(Z_i),f(Z_j))\neq Cov(f(Z_1),f(Z_2))$ can occur for $i\neq j$; that is, the covariance need not be the same across all pairs.

Comments Since the $Z_i$ are identically distributed, $E f(Z_i)=E f(Z_1)$ for all $i$, so $Cov(f(Z_i),f(Z_j))= Cov(f(Z_1),f(Z_2))$ is equivalent to $E(f(Z_i)f(Z_j))=E(f(Z_1)f(Z_2))$. Rewriting the latter equality, $$\int_{[0,1]^2}uv \, dP_{ij}(u,v)=\int_{[0,1]^2}uv \, dP_{12}(u,v), \quad \forall i\neq j,$$ where $P_{ij}$ is the probability distribution induced by the pair $(Z_i,Z_j)$, defined by $P_{ij}(E)=P((Z_i,Z_j)^{-1}(E))=P(\{\omega\in\Omega:(Z_i(\omega),Z_j(\omega))\in E\})$ for every Borel set $E\in\mathcal B(\mathbb R^2)$. Hence, if there exists a pair $(i,j)$ with $P_{ij}\neq P_{12}$, the above equality need not hold.

But, as far as I know, for dependent data the joint distributions $P_{ij}$ are not all the same; in general, they vary with $i$ and $j$.

Question Can you give me feedback on whether my ideas are correct, or indicate how I can proceed to show the result?


Here is a confirming example.

Let $C,D,B_{1},B_{2},\dots$ denote independent Bernoulli rv's, where $p$ denotes the parameter of $C$ and $D$, and $p_{i}$ denotes the parameter of $B_{i}$.

Setting $Z_{i}=B_{i}C+\left(1-B_{i}\right)D$, each $Z_{i}$ is also a Bernoulli rv with parameter $p$ (indeed, $Z_i$ always equals either $C$ or $D$, both Bernoulli with parameter $p$).

Further, $f\left(Z_{i}\right)=Z_{i}$, and conditioning on $(B_i,B_j)$ gives: if $B_i=B_j$ then $Z_iZ_j=C^2=C$ or $D^2=D$ (expectation $p$), while if $B_i\neq B_j$ then $Z_iZ_j=CD$ (expectation $p^2$ by independence). Hence $$\mathbb{E}Z_{i}Z_{j}=\left(p_{i}p_{j}+\left(1-p_{i}\right)\left(1-p_{j}\right)\right)p+\left(p_{i}\left(1-p_{j}\right)+\left(1-p_{i}\right)p_{j}\right)p^{2}=p^{2}+p(1-p)\left(p_{i}p_{j}+\left(1-p_{i}\right)\left(1-p_{j}\right)\right),$$ so $Cov(Z_i,Z_j)=p(1-p)\left(p_{i}p_{j}+\left(1-p_{i}\right)\left(1-p_{j}\right)\right)$.

It is easy to find a sequence $(p_{i})_{i}$ such that $p_{i}p_{j}+(1-p_{i})(1-p_{j})$, and hence the covariance, takes a different value for every pair $\left(i,j\right)$ with $i\neq j$.
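As a numerical sanity check of this construction (the helper name `cov_zi_zj` and the specific parameter values are my own, not from the answer), one can enumerate the 16 outcomes of $(B_i, B_j, C, D)$ and compute $Cov(Z_i, Z_j)$ exactly:

```python
from itertools import product

def cov_zi_zj(p, p_i, p_j):
    """Exact Cov(Z_i, Z_j) for Z_k = B_k*C + (1-B_k)*D, where B_i, B_j, C, D
    are independent Bernoulli(p_i), (p_j), (p), (p); computed by enumerating
    all 16 outcomes of the four sources."""
    e_prod = 0.0
    for b_i, b_j, c, d in product((0, 1), repeat=4):
        prob = ((p_i if b_i else 1 - p_i)
                * (p_j if b_j else 1 - p_j)
                * (p if c else 1 - p)
                * (p if d else 1 - p))
        z_i = b_i * c + (1 - b_i) * d
        z_j = b_j * c + (1 - b_j) * d
        e_prod += prob * z_i * z_j
    return e_prod - p * p  # E[Z_i] = E[Z_j] = p, so Cov = E[Z_i Z_j] - p^2

# Both Z_i and Z_j are Bernoulli(p) in each case, yet the covariance
# depends on (p_i, p_j): Cov = p(1-p) * (p_i*p_j + (1-p_i)*(1-p_j)).
p = 0.5
cov_12 = cov_zi_zj(p, 0.1, 0.2)  # pair with (p_1, p_2) = (0.1, 0.2)
cov_34 = cov_zi_zj(p, 0.5, 0.5)  # pair with (p_3, p_4) = (0.5, 0.5)
print(cov_12, cov_34)  # 0.185 vs 0.125 -- identically distributed, different covariance
```

The two pairs have identical marginals but different covariances, confirming that $P_{ij}$, and hence $Cov(f(Z_i),f(Z_j))$, genuinely varies with $(i,j)$.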