Covariance of sum and maximum


I have the following task:

$X_1, X_2$ are independent and have uniform distribution on $(0,1).$ Calculate $\operatorname{Cov}(X_1+X_2,\max(X_1,X_2))$.

I did it in this way. The distribution of $\max(X_1,X_2)$ is $P(\max(X_1,X_2)=x)=2x$ on $(0,1)$. In this way we have: $E(X_1+X_2)\cdot E\max(X_1,X_2)=1 \cdot \frac{2}{3}$

\begin{align} & E((X_1+X_2) \cdot \max(X_1,X_2))=2 E(X_1\cdot \max(X_1,X_2)) \\[6pt] = {} &2 \cdot \int_0^1 E(t \cdot \max(t,X_2))\cdot f_{X_1}(t) \,dt=2\cdot \int_0^1 t \cdot \frac{t+1}{2} \, dt=\frac{5}{6} \end{align}

So the covariance equals $\frac{5}{6}-\frac{2}{3}=\frac{1}{6}$.

But the correct answer to this task is $\frac{1}{12}$.

Where is my mistake?

Thanks in advance.

There are 4 answers below.

Best answer:

In fact $\Pr(\max\{X_1,X_2\}=x) = 0.$ I assume you must have meant that the value of the probability density function of $\max\{X_1,X_2\}$ at $x$ is $2x.$

$$ \operatorname E(\max\{t,X_2\}) = \operatorname E(\operatorname E(\max\{t,X_2\} \mid \mathbf 1[X_2>t])) $$ where $\mathbf 1[X_2>t] = 1$ or $0$ according as $X_2>t$ or not. $$ \operatorname E(\max\{t,X_2\} \mid \mathbf 1[X_2>t]) = \begin{cases} t & \text{if } X_2\le t, \\ (1+t)/2 & \text{if } X_2 > t. \end{cases} $$ And the expected value of that is \begin{align} & t\cdot\Pr(X_2\le t) + \frac{1+t} 2\cdot\Pr(X_2>t) \\[8pt] = {} & t^2 + \frac{1+t} 2\cdot(1-t) = \frac{1+t^2} 2 . \end{align}

So the inner expectation in your integral should be $\frac{1+t^2}{2}$, not $\frac{t+1}{2}$, which gives $$\operatorname E((X_1+X_2)\max\{X_1,X_2\}) = 2\int_0^1 t\cdot\frac{1+t^2}{2}\,dt = \frac34,$$ and the covariance is $\frac34-\frac23=\frac1{12}.$
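As a sanity check (not part of the original answer), a short Monte Carlo simulation estimates the covariance directly; the sample size `n` and the seed are arbitrary choices:

```python
import random

def estimate_cov(n=1_000_000, seed=0):
    """Monte Carlo estimate of Cov(X1 + X2, max(X1, X2)) for
    independent Uniform(0,1) variables X1, X2."""
    rng = random.Random(seed)
    sum_s = sum_m = sum_sm = 0.0
    for _ in range(n):
        x1, x2 = rng.random(), rng.random()
        s, m = x1 + x2, max(x1, x2)
        sum_s += s
        sum_m += m
        sum_sm += s * m
    # Cov(S, M) = E[S*M] - E[S] * E[M]
    return sum_sm / n - (sum_s / n) * (sum_m / n)

print(estimate_cov())  # should be close to 1/12 ≈ 0.0833
```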

Answer:

I am not sure of your logic for calculating $\operatorname E\left[X_1\max(X_1,X_2)\right]$.

By definition, this is equal to

\begin{align} \operatorname E\left[X_1\max(X_1,X_2)\right]&=\iint x\max(x,y)f_{X_1,X_2}(x,y)\,\mathrm dx\,\mathrm dy \\&=\iint x\max(x,y)\mathbf1_{0<x,y<1}\,\mathrm dx\,\mathrm dy \\&=\iint x^2\mathbf1_{0<y<x<1}\,\mathrm dx\,\mathrm dy+\iint xy\,\mathbf1_{0<x<y<1}\,\mathrm dx\,\mathrm dy \\&=\int_0^1\int_y^1 x^2\,\mathrm dx\,\mathrm dy+\int_0^1 y\int_0^y x\,\mathrm dx\,\mathrm dy \\&=\int_0^1\frac{1-y^3}{3}\,\mathrm dy+\int_0^1\frac{y^3}{2}\,\mathrm dy=\frac14+\frac18=\frac38 \end{align}

Hence $\operatorname E[(X_1+X_2)\max(X_1,X_2)]=2\cdot\frac38=\frac34$ and the covariance is $\frac34-\frac23=\frac1{12}$.
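The double integral above can be confirmed numerically; below is a crude midpoint-rule sketch (the grid size `n` is an arbitrary choice), which should come out near $3/8$:

```python
def riemann_2d(f, n=500):
    """Midpoint-rule approximation of the double integral of f(x, y)
    over the unit square (0,1) x (0,1)."""
    h = 1.0 / n
    return sum(
        f((i + 0.5) * h, (j + 0.5) * h)
        for i in range(n)
        for j in range(n)
    ) * h * h

# E[X1 * max(X1, X2)] under the uniform density on the unit square
val = riemann_2d(lambda x, y: x * max(x, y))
print(val)  # should be close to 3/8 = 0.375
```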

Answer:

Set $X:=\max(X_1,X_2)$. By bilinearity of covariance and symmetry, $$\operatorname{cov}(X_1+X_2,X)=\operatorname{cov}(X_1,X)+\operatorname{cov}(X_2,X)=2\operatorname{cov}(X_1,X).$$ Let's take a closer look at $\operatorname{cov}(X_1,X)$. First notice $E(X_1)=\frac{1}{2}$ and $$E(X)=\int_0^1xf_X(x)\,dx=\int_0^12x^2\,dx=\frac{2}{3}$$ Therefore

$$\operatorname{cov}(X_1,X)=E(X_1X)-E(X_1)E(X)=E(X_1X)-\frac{1}{3}$$

By the law of total expectation, $$E(X_1X)=E(X_1X\mid X_1 \leq X_2)P(X_1 \leq X_2)+E(X_1X\mid X_1>X_2)P(X_1>X_2)$$ Notice $P(X_1 \leq X_2)=P(X_1>X_2)=\frac{1}{2}$ and $$E(X_1X\mid X_1 \leq X_2)=E(X_1X_2\mid X_1 \leq X_2)=\int_0^1\int_{x_1}^1\frac{x_1x_2}{P(X_1 \leq X_2)}\,dx_2\,dx_1=\frac{1}{4}$$ On the other hand, $$E(X_1X\mid X_1 > X_2)=E(X_1^2\mid X_1 > X_2) = \int_0^1 \int_{x_2}^1 \frac{x_1^2}{P(X_1 > X_2)}\,dx_1\,dx_2=\frac{1}{2}$$

We get that $E(X_1X)=\frac{1}{2}\big[\frac{1}{4}+\frac{1}{2}\big]=\frac{3}{8}$, which means $\operatorname{cov}(X_1,X)=\frac{1}{24}$ and finally $$\operatorname{cov}(X_1+X_2,X)=\frac{1}{12}$$
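The two conditional expectations can likewise be checked by simulation (a sketch; `n` and the seed are arbitrary choices): conditioning on which variable is larger should give sample means near $1/4$ and $1/2$.

```python
import random

def conditional_means(n=1_000_000, seed=1):
    """Sample means of X1 * max(X1, X2) on the events
    {X1 <= X2} and {X1 > X2}."""
    rng = random.Random(seed)
    s_le = s_gt = 0.0
    n_le = n_gt = 0
    for _ in range(n):
        x1, x2 = rng.random(), rng.random()
        prod = x1 * max(x1, x2)
        if x1 <= x2:
            s_le += prod
            n_le += 1
        else:
            s_gt += prod
            n_gt += 1
    return s_le / n_le, s_gt / n_gt

m_le, m_gt = conditional_means()
print(m_le, m_gt)  # should be close to 0.25 and 0.5
```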

Answer:

A geometric approach (considering only the half square $0 \le X_1 \le X_2 \le 1$ because of symmetry)

[Figure Unif_max&sum_1: the region $m \le s \le 2m$ in the $(m,s)$ plane]

clearly shows that the joint pdf is $$ p(m,s) = 2\left[ {m \le s \le 2m} \right] $$ where $[P]$ denotes the Iverson bracket and which in fact gives $$ \eqalign{ & \int_{m = 0}^1 {\int_{s = 0}^2 {p(m,s)\,dm\,ds} } = 2\int_{m = 0}^1 {\int_{s = m}^{2m} {\,dm\,ds} } = \cr & = 2\int_{m = 0}^1 {mdm} = 1 \cr} $$

Then $$ \eqalign{ & \overline m = 2\int_{m = 0}^1 {m^{\,2} dm} = {2 \over 3} \cr & \overline s = 2\int_{m = 0}^1 {\int_{s = m}^{2m} {\,dm\,sds} } = 3\int_{m = 0}^1 {m^{\,2} dm} = 1 \cr} $$ and $$ \eqalign{ & 2\int_{m = 0}^1 {\int_{s = m}^{2m} {\,\left( {m - 2/3} \right)\left( {s - 1} \right)dm\,ds} } = \cr & = 2\int_{m = 0}^1 {\left( {m - 2/3} \right)dm\int_{s = m - 1}^{2m - 1} {\,s\,ds} } = \cr & = \int_{m = 0}^1 {\left( {m - 2/3} \right)\left( {3m^{\,2} - 2m} \right)dm} = \cr & = \int_{m = 0}^1 {\left( {3m^{\,3} - 4m^{\,2} + 4/3m} \right)dm} = \cr & = {3 \over 4} - {4 \over 3} + {4 \over 6} = {1 \over {12}} \cr} $$
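After the inner $s$-integration is carried out as above, the whole computation reduces to the one-dimensional integral $\int_0^1 (m-2/3)(3m^2-2m)\,dm$. A quick midpoint-rule check (the grid size is an arbitrary choice) confirms the value $1/12$:

```python
def midpoint(f, n=10_000):
    """Midpoint-rule approximation of the integral of f over (0, 1)."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

# Integrand (m - 2/3)(3m^2 - 2m) from the last step above
val = midpoint(lambda m: (m - 2/3) * (3 * m**2 - 2 * m))
print(val)  # should be close to 1/12 ≈ 0.083333
```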