In a simple setting, suppose revenue depends on a variable $w$ that is uniformly distributed on $[0,1]$. The revenue function is $wd$, where $d$ is the development programme. How do I get the expected revenue when $w$ can take any value from $0$ to $1$ with equal probability?
This might be a simple problem, but I am really weak at maths, so any help will be appreciated.
Thanks for the reply. My actual problem is this: a principal is interested in buying a lake. The revenue from operating the lake as a camping site is determined by the weather (a random parameter $w$ distributed uniformly on $[0, 1]$) and by the development programme $d$. The revenue function is $wd$, the cost of development is $(d^2-1)/2$, and the agent's outside option is $x>0$. The principal needs to specify a contract paying the agent an amount $\pi$ (if the site produces revenue $R$, the principal gets $R-\pi$ and the agent gets $\pi$). Suppose the agent has to accept or reject the contract before observing $w$, but chooses the development programme after observing it, and that $\pi=aR$.
Now, if I solve this by your method, is it correct that the principal's expected payoff is $d/2-E(\pi)$ and the agent's expected payoff is $ad/2-(d^2-1)/2$?
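For what it's worth, a quick Monte Carlo check of the agent's expected payoff under the linear contract $\pi = aR$, holding the share $a$ and the development level $d$ fixed (the function and variable names here are mine, just for illustration):

```python
import random

def agent_expected_payoff_mc(a, d, n=1_000_000, seed=0):
    """Estimate E[a*w*d - (d^2 - 1)/2] for w ~ Uniform(0, 1) by averaging draws."""
    rng = random.Random(seed)
    cost = (d ** 2 - 1) / 2          # development cost, deterministic
    total = sum(a * rng.random() * d - cost for _ in range(n))
    return total / n

# Analytic value is a*d/2 - (d^2 - 1)/2; with a = 0.4, d = 1 that is 0.2.
print(agent_expected_payoff_mc(0.4, 1.0))
```

The estimate should agree with $ad/2-(d^2-1)/2$ up to sampling noise, since the cost term is non-random and only $E(w)=1/2$ matters for the revenue part.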
You need to know the expected value of the uniformly distributed variable $w$. Denoting the expectation operator by $E$, we have $$E(w) = \int_0^1 w \, dw = \frac{1}{2}.$$ This is perhaps not surprising, as $\frac{1}{2}$ is the midpoint of the interval $[0,1]$.
What you really want is $E(wd)$. However, $d$ is not random, so you can do the calculation as $$E(wd) = dE(w) = \frac{d}{2}.$$
This is the expected revenue.
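A quick numerical sanity check of $E(wd) = d/2$, averaging many uniform draws (the names here are mine):

```python
import random

def expected_revenue_mc(d, n=1_000_000, seed=0):
    """Estimate E[w*d] for w ~ Uniform(0, 1) by simple Monte Carlo averaging."""
    rng = random.Random(seed)
    total = sum(rng.random() * d for _ in range(n))
    return total / n

# The estimate should be close to d/2, e.g. about 1.5 for d = 3.
print(expected_revenue_mc(3))
```

Since $d$ is a constant, the simulation just confirms that it factors out of the expectation, leaving $d \cdot E(w) = d/2$.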