In Hull (2008, p. 307), the following equation is found (Eq. 13A.2):
$$E[\max(V-K,0)]=\int_{K}^{\infty} (V-K)g(V)\:dV$$
where $g(V)$ is the PDF of $V$, $K$ is a constant, and both $V,K>0$. He then goes on to show that if $V$ is lognormally distributed, then
$$E[\max(V-K,0)]=E[V]\,N(d_1)-KN(d_2)$$
where $N(\cdot)$ is the standard normal CDF, $d_1=\frac{\ln(E[V]/K)}{w}+\frac{w}{2}$, $d_2=d_1-w$, and $w$ is the standard deviation of $\ln(V)$.
(You might recognize this as being very similar to the Black-Scholes equation.)
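As a quick sanity check on the formula, here is a Monte Carlo sketch (parameters are arbitrary illustrations; the $V$ outside the expectation is $E[V]$ in Hull's statement):

```python
import math
import random
from statistics import NormalDist

random.seed(0)
N = NormalDist().cdf

# Illustrative parameters: ln(V) ~ Normal(mu, w), strike K
mu, w, K = 0.1, 0.4, 1.0
m = math.exp(mu + w * w / 2)          # E[V] for a lognormal V

# Hull's closed form, with E[V] as the forward value
d1 = math.log(m / K) / w + w / 2
d2 = d1 - w
closed_form = m * N(d1) - K * N(d2)

# Direct Monte Carlo estimate of E[max(V - K, 0)]
n = 200_000
mc = sum(max(math.exp(random.gauss(mu, w)) - K, 0.0) for _ in range(n)) / n

print(closed_form, mc)   # the two should agree to within MC noise
```

This only verifies the quoted result numerically; it is not a derivation.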
I would like to find a similar expression for $$E[\max(V-K,Y)]$$
For three cases:
1) where $Y$ is a positive constant
2) where $Y$ is a positive independent random variable
3) where $Y$ is a positive random variable conditional upon $V$
In an earlier post, I was told that, generally speaking, for two positive independent random variables $X$ and $Y$, with CDFs $F(x)$ and $G(y)$ and PDFs $f(x)$ and $g(y)$, $$E[\max(X,Y)]=\int_0^{\infty}wF(w)g(w)\:dw+\int_0^{\infty}wf(w)G(w)\:dw$$
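That identity (the density of $\max(X,Y)$ is $Fg+fG$) can be checked numerically. A sketch, using two independent exponentials as an illustrative choice because $E[\max(X,Y)]$ has the known closed form $1/a+1/b-1/(a+b)$:

```python
import math

# X ~ Exp(a), Y ~ Exp(b), independent (illustrative choice)
a, b = 1.0, 2.0
f = lambda w: a * math.exp(-a * w)        # pdf of X
F = lambda w: 1 - math.exp(-a * w)        # cdf of X
g = lambda w: b * math.exp(-b * w)        # pdf of Y
G = lambda w: 1 - math.exp(-b * w)        # cdf of Y

def trapz(h, lo, hi, n=200_000):
    """Simple trapezoid rule for a smooth integrand on [lo, hi]."""
    dx = (hi - lo) / n
    return dx * (sum(h(lo + i * dx) for i in range(1, n)) + (h(lo) + h(hi)) / 2)

# The quoted formula: ∫ w F(w) g(w) dw + ∫ w f(w) G(w) dw
integral = trapz(lambda w: w * (F(w) * g(w) + f(w) * G(w)), 0.0, 50.0)

# Known closed form for the max of two independent exponentials
exact = 1 / a + 1 / b - 1 / (a + b)
print(integral, exact)
```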
This was informative but I have been unable to use it to find the kind of "Black-Scholes-esque" expression I am looking for.
In the positive-constant case, $$\max(V-K,Y)=Y+\max(V-(K+Y),0),$$ so the result is immediate: apply the lognormal formula with strike $K+Y$ and add $Y$.
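The shifted-strike identity $\max(V-K,Y)=Y+\max(V-(K+Y),0)$ can be checked numerically; a sketch with arbitrary illustrative parameters:

```python
import math
import random
from statistics import NormalDist

random.seed(1)
N = NormalDist().cdf

# Illustrative parameters: ln(V) ~ Normal(mu, w), constants K and Y
mu, w, K, Y = 0.1, 0.4, 1.0, 0.3
m = math.exp(mu + w * w / 2)              # E[V]

# Reuse the lognormal formula with the shifted strike K + Y, then add Y
Kp = K + Y
d1 = math.log(m / Kp) / w + w / 2
d2 = d1 - w
closed_form = Y + m * N(d1) - Kp * N(d2)

# Direct Monte Carlo estimate of E[max(V - K, Y)]
n = 200_000
mc = sum(max(math.exp(random.gauss(mu, w)) - K, Y) for _ in range(n)) / n
print(closed_form, mc)
```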
In the independent case, condition on $Y$ and integrate the constant-$Y$ formula over the density of $Y$. In the dependent case the identity still gives $E[\max(V-K,Y)]=E[Y]+E[\max(V-K-Y,0)]$, but evaluating the second term now requires the joint distribution of $(V,Y)$.
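For the independent case, the conditioning step amounts to averaging the constant-$Y$ closed form over draws of $Y$ (the tower rule). A sketch, with $Y$ exponential as an illustrative choice:

```python
import math
import random
from statistics import NormalDist

random.seed(2)
N = NormalDist().cdf

# Illustrative parameters: ln(V) ~ Normal(mu, w), strike K
mu, w, K = 0.1, 0.4, 1.0
m = math.exp(mu + w * w / 2)              # E[V]

def const_formula(y):
    """E[max(V - K, y)] for a constant y > 0, via the shifted-strike identity."""
    kp = K + y
    d1 = math.log(m / kp) / w + w / 2
    return y + m * N(d1) - kp * N(d1 - w)

# Y independent of V; exponential is just an illustrative choice
lam = 2.0
n = 100_000
ys = [random.expovariate(lam) for _ in range(n)]

# Tower rule: E[max(V-K,Y)] = E_Y[ E[max(V-K,Y) | Y] ]
mixed = sum(const_formula(y) for y in ys) / n

# Direct Monte Carlo over the joint (V, Y) draw, for comparison
direct = sum(max(math.exp(random.gauss(mu, w)) - K, random.expovariate(lam))
             for _ in range(n)) / n
print(mixed, direct)
```

The dependent case would need the conditional density of $Y$ given $V$ inside the outer average, so no comparably simple sketch applies.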