In Hull (2008, p. 307), the following equation is found (Eq. 13A.2):
$$E[\max(V-K,0)]=\int_{K}^{\infty} (V-K)g(V)\:dV$$
where $g(V)$ is the PDF of $V$, and $V,K>0$.
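As a quick numerical sanity check of Eq. 13A.2, one can compare the right-hand integral with a Monte Carlo estimate of the left-hand expectation for a lognormal $V$ (the parameters below are my own illustrative choice, not Hull's):

```python
# Sanity check of E[max(V-K,0)] = ∫_K^∞ (V-K) g(V) dV for a lognormal V
# (illustrative parameters: V = exp(mu + sigma*Z), Z standard normal; not from Hull).
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, K = 0.0, 0.5, 1.0

def g(V):
    """Lognormal PDF of V = exp(mu + sigma*Z)."""
    return np.exp(-(np.log(V) - mu) ** 2 / (2 * sigma ** 2)) / (V * sigma * np.sqrt(2 * np.pi))

# Right-hand side: trapezoid rule on [K, 50]; the lognormal tail beyond 50 is negligible here.
V = np.linspace(K, 50.0, 500_000)
y = (V - K) * g(V)
integral = np.sum((y[1:] + y[:-1]) * np.diff(V)) / 2

# Left-hand side: Monte Carlo estimate of E[max(V-K, 0)].
mc = np.maximum(np.exp(mu + sigma * rng.standard_normal(1_000_000)) - K, 0).mean()

print(integral, mc)  # both ≈ 0.2835 for these parameters
```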
I'd like to generalize this to an expression for $E[\max(x,y)]$ where $x,y>0$.
My guess is:
$$E[\max(x,y)]=\int_{0}^{\infty} x g(x)\:dx+\int_{0}^{\infty} y f(y)\:dy$$
where $g(x)$ and $f(y)$ are the respective PDFs of the variables $x$ and $y$. Is that right?
If $X$ and $Y$ are random variables defined on a probability space $(\Omega,\mathcal F,\mathbb P)$, it is not hard to prove that $W:=\max\{X,Y\}$ is also a random variable. Then by definition, $$\mathbb E[W] = \int_\Omega W\ \mathsf d\mathbb P.$$
By the "law of the unconscious statistician" we can compute this expectation as $$\mathbb E[W] = \int_{\mathbb R^2} \max\{x,y\}\ \mathsf dF(x,y),$$ where $F$ is the joint distribution function of $(X,Y)$.

If we assume that $X$ and $Y$ are independent, we can find the distribution of $W$ explicitly. The event $\{W\leqslant w\}$ equals $\{X\leqslant w\}\cap\{Y\leqslant w\}$, so by independence $$\mathbb P\{W\leqslant w\}=\mathbb P(\{X\leqslant w\}\cap\{Y\leqslant w\})=\mathbb P\{X\leqslant w\}\,\mathbb P\{Y\leqslant w\}.$$ Hence the distribution function of $W$ is $G := F_XF_Y$, where $F_X$ and $F_Y$ are the distribution functions of $X$ and $Y$, respectively. It follows that $$\mathbb E[W] = \int_{\mathbb R}w\ \mathsf d(F_X(w)F_Y(w)).$$

If $X$ and $Y$ are continuous random variables, then by the product rule this reduces to $$\mathbb E[W] = \int_{\mathbb R}w\,(f_X(w)F_Y(w) + F_X(w)f_Y(w))\ \mathsf dw,$$ where $f_X$ and $f_Y$ are the probability density functions of $X$ and $Y$, respectively.
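As a check of the final formula, here is a small numerical experiment with independent exponential variables (an illustrative choice; for $X\sim\mathrm{Exp}(1)$, $Y\sim\mathrm{Exp}(2)$ the closed form is $E[\max\{X,Y\}] = 1 + \tfrac12 - \tfrac13 = \tfrac76$):

```python
# Check E[max(X,Y)] = ∫ w (f_X(w) F_Y(w) + F_X(w) f_Y(w)) dw for independent
# X ~ Exp(1), Y ~ Exp(2) -- an illustrative choice; closed form is 1 + 1/2 - 1/3 = 7/6.
import numpy as np

rng = np.random.default_rng(0)
lam_x, lam_y = 1.0, 2.0

def f(w, lam):  # exponential density
    return lam * np.exp(-lam * w)

def F(w, lam):  # exponential CDF
    return 1.0 - np.exp(-lam * w)

# Numerical integration (trapezoid rule); the integrand is negligible beyond w = 50.
w = np.linspace(0.0, 50.0, 500_000)
y = w * (f(w, lam_x) * F(w, lam_y) + F(w, lam_x) * f(w, lam_y))
formula = np.sum((y[1:] + y[:-1]) * np.diff(w)) / 2

# Monte Carlo estimate of E[max(X, Y)].
mc = np.maximum(rng.exponential(1 / lam_x, 1_000_000),
                rng.exponential(1 / lam_y, 1_000_000)).mean()

print(formula, mc)  # both ≈ 7/6 ≈ 1.1667
```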