Inference of parameters given their relation to expected value


I have a pdf $p(x;\, a, b)$ whose expected value is strictly increasing in each of two parameters $a$ and $b$. Let $m$ denote a random variable distributed according to this pdf.

$x$ is a real number, while $a$ and $b$ each belong to $[-1, 1]$.

I want to determine whether the following is true:

$P(a \geq 0 | b \geq 0, m = 0) \leq 0.5$

The priors for the parameters are as follows:

$p(a = s) = 0.5$ for $s \in [-1, 1]$, and $0$ otherwise.

$p(b = t) = 0.5$ for $t \in [-1, 1]$, and $0$ otherwise.

What I have done so far

$P(a \geq 0 | b \geq 0, m = 0)$

$= \frac {P(b \geq 0, m=0 | a\geq 0) P(a \geq 0)} {P(b \geq 0, m=0)}$

$= \frac {P(m=0|a \geq 0, b \geq 0) P(b \geq 0| a \geq 0)P(a \geq 0)} {P(m = 0 | b\geq 0)P(b\geq 0)}$

(Since $m$ is continuous, $P(m = 0 \mid \cdot)$ should be read as the conditional density of $m$ evaluated at $0$.) Because the prior distribution of $b$ is independent of $a$, we have $P(b \geq 0 \mid a \geq 0) = P(b \geq 0)$; using also $P(a \geq 0) = 0.5$, the expression simplifies to

$\frac{P(m=0 \mid a \geq 0, b \geq 0)}{P(m = 0 \mid b \geq 0)} \times 0.5$

Thus, the expression is less than or equal to $0.5$ if and only if

$P(m=0|a \geq 0, b \geq 0) \leq P(m = 0 | b\geq 0)$
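The rearrangement above can be sanity-checked numerically. The sketch below assumes (purely for illustration, since the question leaves the pdf unspecified) the toy model $m \mid a, b \sim \mathcal{N}(a+b, 1)$ with the uniform priors stated earlier, and compares the posterior probability computed directly against the rearranged Bayes expression:

```python
import math

# Assumed toy model (an assumption, not from the question):
# m | a, b ~ Normal(a + b, 1), with priors a, b ~ Uniform(-1, 1).
def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def double_integral(f, a_lo, a_hi, b_lo, b_hi, n=300):
    """Midpoint-rule approximation of the double integral of f over a box."""
    da, db = (a_hi - a_lo) / n, (b_hi - b_lo) / n
    return sum(f(a_lo + (i + 0.5) * da, b_lo + (j + 0.5) * db)
               for i in range(n) for j in range(n)) * da * db

lik = lambda a, b: normal_pdf(0.0, a + b)  # conditional density of m at 0 given (a, b)

# Integrals of the likelihood over the relevant parameter regions.
I_pos = double_integral(lik, 0.0, 1.0, 0.0, 1.0)    # a >= 0, b >= 0
I_all = double_integral(lik, -1.0, 1.0, 0.0, 1.0)   # a in [-1, 1], b >= 0

# Direct posterior P(a >= 0 | b >= 0, m = 0): the flat prior densities cancel.
direct = I_pos / I_all

# Via the rearranged expression: p(m=0 | a>=0, b>=0) / p(m=0 | b>=0) * 0.5,
# where p(m=0 | a>=0, b>=0) = I_pos and p(m=0 | b>=0) = 0.5 * I_all.
via_bayes = I_pos / (0.5 * I_all) * 0.5

print(direct, via_bayes)  # the two routes agree
```

The two routes coincide, which confirms the algebra for this model but of course says nothing yet about the general inequality.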

Intuitively, this condition seems true. Increasing $a$ or $b$ increases the expected value of $m$, so knowing that both $a$ and $b$ are non-negative should make $m = 0$ less likely than knowing only that $b \geq 0$.

But common sense is often not a good guide to probability.
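The intuition can at least be tested on one concrete model. The sketch below again assumes (as an illustration only) $m \mid a, b \sim \mathcal{N}(a+b, 1)$, which satisfies the strictly-increasing-mean condition, and compares the two conditional densities by numerical integration:

```python
import math

# Assumed toy model (an assumption, not from the question):
# m | a, b ~ Normal(a + b, 1), with the uniform priors stated above.
def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def double_integral(f, a_lo, a_hi, b_lo, b_hi, n=300):
    """Midpoint-rule approximation of the double integral of f over a box."""
    da, db = (a_hi - a_lo) / n, (b_hi - b_lo) / n
    return sum(f(a_lo + (i + 0.5) * da, b_lo + (j + 0.5) * db)
               for i in range(n) for j in range(n)) * da * db

lik = lambda a, b: normal_pdf(0.0, a + b)  # conditional density of m at 0 given (a, b)

# p(m = 0 | a >= 0, b >= 0): after conditioning, a and b are uniform on [0, 1].
lhs = double_integral(lik, 0.0, 1.0, 0.0, 1.0)

# p(m = 0 | b >= 0): a keeps its full prior (density 1/2 on [-1, 1]).
rhs = 0.5 * double_integral(lik, -1.0, 1.0, 0.0, 1.0)

print(lhs <= rhs)  # prints True for this Gaussian model
```

The inequality holds here, but one model proves nothing about arbitrary pdfs, which is exactly the question below.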

So I want to know: is the condition actually mathematically guaranteed, given only the stated conditions on the expected value? If yes, how can it be proved? If no, what is a pdf of $m$, perhaps as a function of $a, b$, for which the condition fails? Or, under what conditions does the claim hold?