Stochastic decision problem with normal distribution


Suppose the decision maker receives a piece of information (signal) $s=\theta+e$, where the true parameter $\theta$ and the error $e$ are normally distributed, and makes a decision $d\ge 0$ in order to maximize $$-Pr(\theta<T-d|s)W-d,$$ with $W>0$ and $T\in\mathbb{R}$ being scalars.

Suppose the optimal decision is expressed as a function $d(s)$: $$d(s):=\arg\max_{d\ge 0} -Pr(\theta<T-d|s)W-d.$$ Then here is my problem: Is the function $s+d(s)$ always (i.e., for all $W,T$) strictly increasing in $s$, and how can I show this?

The problem is nasty because the optimization problem is not globally concave in $d$, so the first order condition does not always determine the optimum.
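Because of that non-concavity, it is safer to explore the problem numerically by brute force rather than via the first order condition. The sketch below (with illustrative choices $\theta\sim N(0,1)$, $e\sim N(0,1)$, $W=20$, $T=2$; none of these values are fixed by the question) computes $d(s)$ by grid search over $d$:

```python
import math

# Illustrative parameters (none are fixed by the question):
MU0, VAR_TH, VAR_E = 0.0, 1.0, 1.0   # prior theta ~ N(MU0, VAR_TH), noise e ~ N(0, VAR_E)
W, T = 20.0, 2.0

# With a normal prior and normal noise, the posterior of theta given s is normal:
POST_VAR = VAR_TH * VAR_E / (VAR_TH + VAR_E)

def post_mean(s):
    return (VAR_E * MU0 + VAR_TH * s) / (VAR_TH + VAR_E)

def norm_cdf(x, mu, var):
    return 0.5 * (1.0 + math.erf((x - mu) / math.sqrt(2.0 * var)))

def objective(d, s):
    # the decision maker's payoff: -Pr(theta < T - d | s) * W - d
    return -norm_cdf(T - d, post_mean(s), POST_VAR) * W - d

def d_opt(s, d_max=10.0, n=4001):
    # brute-force grid search over d in [0, d_max]; since the objective is not
    # globally concave in d, this is safer than solving the first order condition
    grid = [i * d_max / (n - 1) for i in range(n)]
    return max(grid, key=lambda d: objective(d, s))
```

For these numbers `d_opt` returns an interior optimum for moderate $s$ and the corner $d=0$ once $s$ is large, since the posterior then puts negligible mass below $T$ anyway.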

What can I show? Well, if $W$ is small enough, then the corner solution $d(s)=0$ holds for all $s$, and then $s+d(s)=s$ is trivially strictly increasing. However, if $W$ is large enough, then for some interval of signals an interior solution $d(s)>0$ holds. I plotted an example of $d(s)$ in such a case:

[plot of $d(s)$ omitted]

The downward-sloping part is the interval where the first order condition $$1=-\frac{\partial Pr(\theta<T-d|s)}{\partial d}W$$ holds. I was able to get an explicit expression for this interior $d(s)$: $$d(s)=T-PDF^{-1}(1/W)$$ $$=T+\sqrt{-2\sigma^2 \log\!\big(\sqrt{2\pi\sigma^2}/W\big)}-\mu(s),$$ where $PDF^{-1}$ is the inverse of the increasing (left) branch of the normal probability density function, and $\sigma^2$ and $\mu(s)$ are the variance and mean of the normal posterior distribution, the mean depending on the information $s$.
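As a sanity check on this closed form: with hypothetical numbers $\sigma^2=0.5$, $\mu(s)=1$, $W=20$, $T=2$ (illustrative only; the question fixes none of these), the candidate $d$ should make the posterior density at $T-d$ equal exactly $1/W$, with $T-d$ on the left branch:

```python
import math

# Hypothetical posterior parameters, for illustration only
W, T = 20.0, 2.0
POST_VAR = 0.5   # posterior variance sigma^2
MU = 1.0         # posterior mean mu(s) at some fixed signal s

# Interior candidate: d = T - mu + sqrt(-2*sigma^2 * log(sqrt(2*pi*sigma^2)/W));
# the square root is real whenever W > sqrt(2*pi*sigma^2)
inner = -2.0 * POST_VAR * math.log(math.sqrt(2.0 * math.pi * POST_VAR) / W)
d_star = T - MU + math.sqrt(inner)

# The first order condition says the posterior density at T - d_star equals 1/W,
# and T - d_star must lie on the left branch (below the mean)
pdf = math.exp(-(T - d_star - MU) ** 2 / (2.0 * POST_VAR)) / math.sqrt(2.0 * math.pi * POST_VAR)
assert T - d_star < MU            # left branch
assert abs(W * pdf - 1.0) < 1e-9  # density equals 1/W
```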

Because $s$ enters only through the mean, and with a normal prior and normal noise the posterior mean $\mu(s)$ is linear in $s$ with slope $\sigma_\theta^2/(\sigma_\theta^2+\sigma_e^2)$ strictly between $0$ and $1$, the slope of the interior $d(s)$ is strictly greater than $-1$. So $s+d(s)$ is strictly increasing whenever $d(s)$ is determined by the first order condition.

But perhaps for some parameter values $W,T$ there is a discontinuity where $d(s)$ switches from the interior solution to the corner solution $d(s)=0$ (around $s=9$ in the plot); a downward jump in $d(s)$ there would make $s+d(s)$ non-monotone. This is obviously not the case in the plot, but can it happen for other parameters? If not, how can I show it cannot happen (so that $s+d(s)$ really is always strictly increasing)?
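This can at least be probed numerically. The sketch below (with illustrative choices $\theta\sim N(0,1)$, $e\sim N(0,1)$, $W=20$, $T=2$; a spot check for one parameterization, not a proof) computes $d(s)$ by grid search over a range of $s$ spanning the interior-to-corner switch and checks whether $s+d(s)$ ever decreases:

```python
import math

# Illustrative parameters; a numerical spot check, not a proof
MU0, VAR_TH, VAR_E = 0.0, 1.0, 1.0   # prior theta ~ N(MU0, VAR_TH), noise e ~ N(0, VAR_E)
W, T = 20.0, 2.0
POST_VAR = VAR_TH * VAR_E / (VAR_TH + VAR_E)

def post_mean(s):
    return (VAR_E * MU0 + VAR_TH * s) / (VAR_TH + VAR_E)

def norm_cdf(x, mu, var):
    return 0.5 * (1.0 + math.erf((x - mu) / math.sqrt(2.0 * var)))

def d_opt(s, d_max=6.0, n=2401):
    # grid search over d, since the objective is not globally concave
    grid = [i * d_max / (n - 1) for i in range(n)]
    return max(grid, key=lambda d: -norm_cdf(T - d, post_mean(s), POST_VAR) * W - d)

# For these parameters the optimum switches from interior to corner near s = 7.1;
# scan s across the switch and test monotonicity of s + d(s)
ss = [-2.0 + 0.05 * i for i in range(281)]
vals = [s + d_opt(s) for s in ss]
increasing = all(b > a for a, b in zip(vals, vals[1:]))
```

For these numbers the scan finds no violation: the interior branch of $d(s)$ has slope $-1/2$ here, which never overturns the $+1$ slope of $s$ itself, and no downward jump shows up at the switch.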

I would appreciate any help, either with another approach or by finishing that (hopefully) last step in my approach. Thanks!