If I have a random variable D and a function G(x,D), why does defining g(x) = E[G(x,D)] give $g(x) = g(0) + \int_0^x g'(z) dz$?


I am reading this tutorial on stochastic programming by Shapiro and Philpott, and there is something in it that I don't understand.

There is a random variable $D$.

There is a function $G(x,D) = cx + b\max\{D-x,0\} + h\max\{x-D,0\}$.

Let $F$ be the cdf of the demand $D$, i.e., $F(z):=\mathrm{Prob}(D\le z)$.

In the appendix (Section 6.1), they derive that $E[G(x,D)] = bE[D] + (c-b)x + (b+h)\int_0^xF(z)\, dz$.
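As a sanity check of that closed form (my own sketch, not from the tutorial), here is a quick Monte Carlo comparison. The demand distribution $D \sim \mathrm{Exp}(1)$ and the cost parameters $c=1$, $b=3$, $h=0.5$ are arbitrary illustrative choices:

```python
import math
import random

# Illustrative parameters (my choice, not from the tutorial): order cost c,
# backorder cost b, holding cost h, and demand D ~ Exponential(1).
c, b, h = 1.0, 3.0, 0.5
x = 1.0  # order quantity at which we evaluate g(x) = E[G(x, D)]

# Monte Carlo estimate of E[G(x, D)] by averaging G over demand samples
random.seed(0)
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]
mc = sum(c * x + b * max(d - x, 0.0) + h * max(x - d, 0.0) for d in samples) / n

# Closed form: g(x) = b E[D] + (c - b) x + (b + h) * integral_0^x F(z) dz.
# For Exp(1): E[D] = 1 and integral_0^x F(z) dz = x - (1 - exp(-x)).
closed = b * 1.0 + (c - b) * x + (b + h) * (x - (1.0 - math.exp(-x)))

print(mc, closed)  # the two values should agree to within Monte Carlo error
```

With 200,000 samples the two numbers agree to a couple of decimal places, which is consistent with the appendix's formula.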

They say that if $g(x) = E[G(x,D)]$, then for $x\ge 0$ we have:

$$g(x) = g(0) + \int_0^xg'(z)\, dz$$

Why is that?


There is 1 best solution below.


That is the fundamental theorem of calculus (the Newton–Leibniz formula). It applies here because $g$ is convex and absolutely continuous on $[0,\infty)$, with derivative $g'(z) = c - b + (b+h)F(z)$ at every point where $F$ is continuous, as you can see by differentiating the closed-form expression for $E[G(x,D)]$.

On the right-hand side of $g(x) = g(0) + \int_0^x g'(z)\, dz$, the integral continuously accumulates the infinitesimal increments $g'(z)\,dz$ (the limit of the finite differences $g(z+\Delta z) - g(z)$) from $0$ to $x$; adding that total change to the starting value $g(0)$ recovers $g(x)$ on the left-hand side.
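To see the identity hold numerically (my own sketch; the demand $D \sim \mathrm{Exp}(1)$ and costs $c=1$, $b=3$, $h=0.5$ are assumptions for illustration), we can evaluate $g'(z) = c - b + (b+h)F(z)$, integrate it numerically from $0$ to $x$, and compare $g(0) + \int_0^x g'(z)\,dz$ against the closed form for $g(x)$:

```python
import math

# Illustrative parameters (my choice): costs and D ~ Exponential(1)
c, b, h = 1.0, 3.0, 0.5
F = lambda z: 1.0 - math.exp(-z)  # cdf of Exponential(1)
E_D = 1.0                         # E[D] for Exponential(1)

def g(x):
    # Closed form: g(x) = b E[D] + (c - b) x + (b + h) * integral_0^x F(z) dz,
    # where for Exp(1) the last integral equals x - (1 - exp(-x)).
    return b * E_D + (c - b) * x + (b + h) * (x - (1.0 - math.exp(-x)))

def g_prime(z):
    # Differentiating the closed form gives g'(z) = c - b + (b + h) F(z)
    return c - b + (b + h) * F(z)

# Trapezoidal approximation of integral_0^x g'(z) dz
x, n = 2.0, 10_000
dz = x / n
integral = sum(0.5 * (g_prime(i * dz) + g_prime((i + 1) * dz)) * dz
               for i in range(n))

lhs = g(x)
rhs = g(0.0) + integral
print(lhs, rhs)  # both sides of g(x) = g(0) + integral_0^x g'(z) dz
```

The two sides match to high precision, which is exactly what the fundamental theorem of calculus guarantees for this (convex, absolutely continuous) $g$.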