Consider a random variable $\phi$. Assume $\mathbb{E}[\phi]<\infty$. Let $F_\phi(t)$ be its cumulative distribution function. Let $(a)^+=\max\{0,a\}$.
I am trying to understand the proof of the following result: $$\mathbb{E}[(\phi-z)^+]=\int_z^\infty(1-F_\phi(t))dt.$$ It proceeds like this: $$ \mathbb{E}[(\phi-z)^+]=\int_z^\infty(t-z)dF_\phi(t)=\lim_{y\to\infty}\int_z^y(t-z)dF_\phi(t). $$ Now, focusing on the argument of the limit and using integration by parts, we get $$ \int_z^y(t-z)dF_\phi(t)=(t-z)F_\phi(t)\Big\vert^y_z-\int_z^y F_\phi(t)dt.$$ (Note the upper limit of the integral is $y$, not $\infty$.) So far so good. But the next step confuses me: $$(t-z)F_\phi(t)\Big\vert^y_z-\int_z^y F_\phi(t)dt=-(y-z)(1-F_\phi(y))+\int_z^y (1-F_\phi(t))dt.$$
I fail to understand why the last equality holds. For the first term, I would expect $$(y-z)F_\phi(y)-(z-z)F_\phi(z)=(y-z)F_\phi(y)$$ but instead a $1-F_\phi$ shows up there and in the integral. My guess is that it is probably some simple equivalence regarding $F_\phi(t)$ and $1-F_\phi(t)$ but I cannot figure it out. Any hints?
When you apply integration by parts, you are integrating $dv\equiv dF_\phi(t)$. It is common to choose the integration constant to be $0$, but in general $$\int dF_\phi(t)=F_\phi(t)+c,$$ with $c\in\mathbb{R}$ arbitrary. One way to obtain the equation you are looking for is to take the antiderivative with $c=-1$. Note that the result of integration by parts does not depend on which member of the family of antiderivatives you choose.
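Concretely, taking $c=-1$, i.e. $v=F_\phi(t)-1$ (the boundary term at $t=z$ vanishes because $u=t-z$ is zero there): $$\int_z^y (t-z)\,dF_\phi(t) = (t-z)\big[F_\phi(t)-1\big]\Big\vert_z^y - \int_z^y \big(F_\phi(t)-1\big)dt = -(y-z)\big(1-F_\phi(y)\big)+\int_z^y \big(1-F_\phi(t)\big)dt,$$ which is exactly the right-hand side that appears in the proof.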
Edit: Let us show that the result does not depend on $c$. Integration by parts states that $$\int_a^b u\,dv=uv\Big\vert_a^b-\int_a^b v\,du. $$ If we take $u=t-z$ and $dv = dF_\phi(t)$, then $du=dt$ and $v=F_\phi(t)+c$. Therefore,
$$\int_z^y (t-z)dF_\phi(t) = (t-z)[F_\phi(t)+c]\Big|_z^y - \int_z^y \left(F_\phi(t)+c\right)dt \\ =(y-z)\left[F_\phi(y)+c\right]-\int_z^y F_\phi(t)dt - c(y-z)\\ = (y-z)F_\phi(y)-\int_z^y F_\phi(t)dt,$$ which is independent of $c$.
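As a quick sanity check (not part of the original argument), the identity $\mathbb{E}[(\phi-z)^+]=\int_z^\infty(1-F_\phi(t))dt$ can be verified numerically. Here is a sketch with $\phi\sim\mathrm{Exp}(1)$ as an example distribution of my choosing, where $1-F_\phi(t)=e^{-t}$ and both sides equal $e^{-z}$:

```python
import numpy as np

rng = np.random.default_rng(0)
z = 0.7

# phi ~ Exp(1), so F(t) = 1 - exp(-t) for t >= 0.
samples = rng.exponential(scale=1.0, size=2_000_000)

# Left-hand side: E[(phi - z)^+] estimated by Monte Carlo.
lhs = np.maximum(samples - z, 0.0).mean()

# Right-hand side: integral of 1 - F(t) = exp(-t) over [z, infinity),
# approximated by a midpoint Riemann sum on [z, z + 40]
# (the tail beyond that is negligible).
dt = 1e-4
t = np.arange(z + dt / 2, z + 40.0, dt)
rhs = np.sum(np.exp(-t)) * dt

# Closed form for this example: exp(-z).
print(f"MC: {lhs:.4f}  integral: {rhs:.4f}  exact: {np.exp(-z):.4f}")
```

The three numbers agree up to Monte Carlo error, which is what the identity predicts.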