Function to bound a probability away from 0 and 1


I'm writing a statistical model in which I want to apply a convex penalty to a model-derived probability, call it $p$. There is also a known constant, $c$. For the solution of the statistical model to have desirable properties, I would like the penalty function $F(p)$ to satisfy the following:

  1. $F(0) = \infty$

  2. $F(1) = \infty$

  3. $F(p) > F(c) \;\; \forall\, p \neq c$

  4. $F$ is convex in $p$

Thus, the penalty is minimized when $p$ equals the known constant $c$. So far, the best I have been able to come up with is

$$ F(p) = \frac{1}{p} + \frac{1}{1-p}$$

but this, of course, has its minimum at $p = 1/2$, not at $c$. Note that I only need the function to be defined for $p \in [0,1]$. The simpler the function the better. If it involves logarithms or exponentials, that is especially nice because of the ease of computing derivatives with respect to $p$.
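For what it's worth, here is a quick numerical sanity check of the candidate above (a throwaway sketch, not part of the model; the function name `F` is just for illustration). It confirms that the minimizer of $1/p + 1/(1-p)$ is $p = 1/2$ regardless of $c$, so property 3 fails whenever $c \neq 1/2$:

```python
def F(p):
    """Candidate penalty: 1/p + 1/(1-p), defined on the open interval (0, 1)."""
    return 1.0 / p + 1.0 / (1.0 - p)

# Evaluate on a fine interior grid and locate the minimizer.
grid = [i / 1000.0 for i in range(1, 1000)]
p_min = min(grid, key=F)
print(p_min)  # minimizer is 0.5, independent of any constant c
```

The symmetry of the two terms around $p = 1/2$ makes this outcome clear analytically as well: $F'(p) = -1/p^2 + 1/(1-p)^2$ vanishes only at $p = 1/2$.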