How to express $E[\max(x,y)]$ as an integral?


In Hull (2008, p. 307), the following equation is found (Eq. 13A.2):

$$E[\max(V-K,0)]=\int_{K}^{\infty} (V-K)g(V)\:dV$$

where $g(V)$ is the PDF of $V$ and $V,K>0$.
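As a quick sanity check of Hull's formula, here is a short Monte Carlo sketch. The choice of distribution is an illustrative assumption: if $V\sim\text{Exponential}(1)$, then integrating $(V-K)e^{-V}$ from $K$ to $\infty$ by parts gives the closed form $e^{-K}$.

```python
import math
import random

# Monte Carlo check of E[max(V-K, 0)] = ∫_K^∞ (V-K) g(V) dV.
# Illustrative assumption: V ~ Exponential(1), so the right-hand side
# has the closed form e^{-K} (integrate (v-K)e^{-v} by parts).
random.seed(0)
K = 0.5
n = 200_000
mc = sum(max(random.expovariate(1.0) - K, 0.0) for _ in range(n)) / n
exact = math.exp(-K)
print(mc, exact)  # the two values should agree to about two decimal places
```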

I'd like to extrapolate from this to find a general expression for $E[\max(x,y)]$ where $x,y>0$.

I guess it would be:

$$E[\max(x,y)]=\int_{0}^{\infty} x g(x)\:dx+\int_{0}^{\infty} y f(y)\:dy$$

where $g(x)$ and $f(y)$ are the respective PDFs of the variables $x$ and $y$. Is that right?

There are 3 answers below.

Best answer:

If $X$ and $Y$ are random variables defined on a probability space $(\Omega,\mathcal F,\mathbb P)$, it is not hard to prove that $W:=\max\{X,Y\}$ is also a random variable. Then by definition, $$\mathbb E[W] = \int_\Omega W\ \mathsf d\mathbb P.$$

By the "law of the unconscious statistician" we can compute this expectation as $$\mathbb E[W] = \int_{\mathbb R^2} \max\{x,y\}\ \mathsf dF(x,y),$$ where $F$ is the joint distribution function of $(X,Y)$.

If we assume that $X$ and $Y$ are independent, we can find the distribution of $W$ explicitly. The event $\{W\leqslant w\}$ is the same as $\{X\leqslant w\}\cap\{Y\leqslant w\}$, and so $$\mathbb P\{W\leqslant w\}=\mathbb P\{X\leqslant w,\ Y\leqslant w\}=\mathbb P\{X\leqslant w\}\,\mathbb P\{Y\leqslant w\}. $$ Hence the distribution function of $W$ is $G := F_XF_Y$, where $F_X$ and $F_Y$ are the distribution functions of $X$ and $Y$, respectively. It follows that $$\mathbb E[W] = \int_{\mathbb R}w\ \mathsf d(F_X(w)F_Y(w)). $$ If $X$ and $Y$ are continuous random variables, this reduces to $$\mathbb E[W] = \int_{\mathbb R}w\,(f_X(w)F_Y(w) + F_X(w)f_Y(w))\ \mathsf dw, $$ where $f_X$ and $f_Y$ are the probability density functions of $X$ and $Y$, respectively.
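The final formula is easy to check numerically. A minimal sketch, under the illustrative assumption that $X$ and $Y$ are independent standard normals: then $f_XF_Y + F_Xf_Y = 2\varphi\Phi$ by symmetry, and the known value is $\mathbb E[\max(X,Y)] = 1/\sqrt{\pi}$.

```python
import math

# Numerical check of E[W] = ∫ w (f_X(w) F_Y(w) + F_X(w) f_Y(w)) dw
# for X, Y independent standard normals, where E[max(X,Y)] = 1/sqrt(pi).

def phi(w):
    """Standard normal density."""
    return math.exp(-w * w / 2) / math.sqrt(2 * math.pi)

def Phi(w):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + math.erf(w / math.sqrt(2)))

# Midpoint-rule integration over [-8, 8]; the tails beyond are negligible.
n, a, b = 200_000, -8.0, 8.0
h = (b - a) / n
total = 0.0
for i in range(n):
    w = a + (i + 0.5) * h
    # For iid X, Y: f_X F_Y + F_X f_Y = 2 * phi * Phi.
    total += w * 2 * phi(w) * Phi(w) * h
print(total, 1 / math.sqrt(math.pi))  # both ≈ 0.5642
```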

Answer:

Let $X$ and $Y$ be (say positive) independent random variables, with continuous distributions, densities $f(x)$ and $g(y)$, and cumulative distribution functions $F(x)$ and $G(y)$. Let $W=\max(X,Y)$. Then $\Pr(W\le w)=\Pr(X\le w)\Pr(Y\le w)=F(w)G(w)$.

Thus the cumulative distribution function of $W$ is $F(w)G(w)$, and therefore by the Product Rule the density function of $W$ is $$F(w)g(w)+G(w)f(w),$$ and therefore $$E(W)=\int_0^\infty wF(w)g(w)\,dw+\int_0^\infty wG(w)f(w)\,dw.$$

Remark: The above expression can be manipulated in various ways, but your expression is not one of them. For let $X$ and $Y$ be independent and uniformly distributed on $(0,1)$. Then $E(W)=\frac{2}{3}$, while your expression gives $E(X)+E(Y)=1$, which cannot be correct in any case, since $W<1$ with probability $1$.
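The uniform counterexample is easy to confirm by simulation; a minimal sketch:

```python
import random

# For X, Y iid Uniform(0,1), E[max(X,Y)] = 2/3 ≈ 0.667,
# whereas the question's proposed expression E[X] + E[Y] equals 1.
random.seed(1)
n = 200_000
mc = sum(max(random.random(), random.random()) for _ in range(n)) / n
print(mc)  # ≈ 2/3, not 1
```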

Answer:

Let $Z=\max(X,Y)$ and let $f(x,y)$ be the joint PDF of $(X,Y)$ (not assuming independence). Then \begin{align*} \Pr[Z<z] &= \Pr[\max(X,Y)<z] \\ &= \Pr[X<z \text{ and } Y<z] \\ &= \int_{-\infty}^z \int_{-\infty}^z f(x,y)\,dx\,dy \\ &=: F(z), \end{align*} where $F(z)$ denotes the CDF of $Z$.

From the CDF $F(z)$, we may find the PDF $f_Z(z)=F'(z)$ of $Z$ (written $f_Z$ to distinguish it from the joint density $f$) by the Leibniz rule: $$ f_Z(z) = \int_{-\infty}^z f(z,y)\,dy + \int_{-\infty}^z f(x,z)\,dx = \int_{-\infty}^z [f(t,z)+f(z,t)]\,dt. $$

Finally, we may now compute the expectation of $Z$: $$ \mathbb{E}[Z] = \int_{-\infty}^\infty z \int_{-\infty}^z [f(t,z)+f(z,t)]\,dt\,dz = \int_{-\infty}^\infty \int_{t}^\infty z[f(t,z)+f(z,t)]\,dz\,dt. $$

(Note: I just noticed you said your variables were positive. Just replace $-\infty$ with $0$ and the result holds.)
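A numerical sketch of the double integral above, using the dependent joint density $f(x,y)=x+y$ on $[0,1]^2$ as an illustrative assumption (it integrates to $1$, and a direct computation gives $E[\max(X,Y)]=3/4$):

```python
# Check E[Z] = ∫ z ∫_{-∞}^z [f(t,z) + f(z,t)] dt dz on a grid, with the
# (dependent) joint density f(x,y) = x + y on [0,1]^2; exact value is 3/4.
def f(x, y):
    return x + y

m = 400           # grid cells per axis
h = 1.0 / m
total = 0.0
for i in range(m):
    z = (i + 0.5) * h             # midpoint in the z direction
    inner = 0.0
    for j in range(i):            # t midpoints below z (inner integral)
        t = (j + 0.5) * h
        inner += (f(t, z) + f(z, t)) * h
    total += z * inner * h
print(total)  # ≈ 0.75 up to discretization error
```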