I was wondering whether it is possible to prove the following (or to show that it is false). Given two independent random variables
$a\sim \mathcal{N}(\alpha,\sigma_\alpha^2)$ and $d\sim \mathcal{N}(\delta,\sigma_\delta^2)$,
I want to show that
$\Pr(a>0\mid a+d=\pi)$ is increasing in $\pi$. In other words, the conditional probability that $a>0$ increases with the sum $a+d$.
Thanks!
Hint: $\Pr (a>0\mid a+d=\pi) = \dfrac{\displaystyle\int_0^\infty f_a(u)\;f_d(\pi-u)\operatorname d u}{\displaystyle\int_{-\infty}^\infty f_a(u)\;f_d(\pi-u)\operatorname d u}$
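A quick numerical sanity check of the claim, and of the hint's ratio-of-integrals formula: since $a$ and $a+d$ are jointly normal, the conditional law of $a$ given $a+d=\pi$ is normal with mean $\alpha+\frac{\sigma_\alpha^2}{\sigma_\alpha^2+\sigma_\delta^2}(\pi-\alpha-\delta)$ (linear and increasing in $\pi$) and variance $\frac{\sigma_\alpha^2\sigma_\delta^2}{\sigma_\alpha^2+\sigma_\delta^2}$ (free of $\pi$), so $\Pr(a>0\mid a+d=\pi)=\Phi(\mu(\pi)/\sigma)$ is increasing. The sketch below (stdlib only; the parameter values are arbitrary choices of mine, not from the question) compares the hint's ratio, evaluated by the trapezoid rule, against this closed form and checks monotonicity:

```python
import math

# Hypothetical example parameters (arbitrary, not from the question).
alpha, sd_a = 0.3, 1.2   # mean and sd of a
delta, sd_d = -0.5, 0.8  # mean and sd of d

def npdf(x, mu, sd):
    """Normal density f(x) with mean mu and standard deviation sd."""
    z = (x - mu) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2.0 * math.pi))

def ncdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def trapezoid(f, lo, hi, n=20001):
    """Trapezoid rule for f on [lo, hi] with n grid points."""
    h = (hi - lo) / (n - 1)
    s = 0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n - 1))
    return s * h

def pr_numeric(pi):
    """The hint's ratio of integrals, truncated to |u| <= 20."""
    g = lambda u: npdf(u, alpha, sd_a) * npdf(pi - u, delta, sd_d)
    return trapezoid(g, 0.0, 20.0) / trapezoid(g, -20.0, 20.0)

def pr_closed(pi):
    """Closed form: a | a+d = pi is normal with the mean/sd below."""
    k = sd_a**2 / (sd_a**2 + sd_d**2)      # regression coefficient of a on a+d
    mu = alpha + k * (pi - alpha - delta)  # conditional mean, increasing in pi
    sd = math.sqrt(sd_a**2 * sd_d**2 / (sd_a**2 + sd_d**2))  # free of pi
    return ncdf(mu / sd)

pis = [-2.0, -1.0, 0.0, 1.0, 2.0]
vals = [pr_numeric(p) for p in pis]
# The numerical ratio agrees with the closed form ...
assert all(abs(v - pr_closed(p)) < 1e-4 for p, v in zip(pis, vals))
# ... and is strictly increasing in pi.
assert all(v1 < v2 for v1, v2 in zip(vals, vals[1:]))
print([round(v, 4) for v in vals])
```

This only checks a few points for one parameter choice, of course; the proof itself comes from the conditional mean being increasing in $\pi$ while the conditional variance stays constant.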