I'm writing a statistical model in which I want to apply a convex penalty to a model-derived probability, call it p. There is also a known constant, c. For the solution to the statistical model to have desirable properties, I would like this penalty function, F(p), to have the following properties:
$F(0) = \infty$
$F(1) = \infty$
$F(p) > F(c) \ \forall p \neq c$
$F$ is convex in $p$
Thus, the penalty is minimized when p equals the known constant c. So far, the best I have been able to come up with is
$$ F(p) = \frac{1}{p} + \frac{1}{1-p}$$
but this, of course, does not have its minimum at c: by symmetry its minimum sits at $p = 1/2$ regardless of the value of c. Note that I only need the function to be defined for $p \in [0,1]$. The simpler the function the better. If it involves logarithms or exponentials, that is especially nice because of the ease of computing derivatives with respect to p.
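For concreteness, here is a quick numerical sketch confirming the problem with my candidate: a simple grid search shows its minimizer is $1/2$ no matter what c is (the grid resolution of 1/10000 is an arbitrary choice).

```python
def F(p):
    """Candidate penalty from above: 1/p + 1/(1-p)."""
    return 1.0 / p + 1.0 / (1.0 - p)

# Grid search over the open interval (0, 1); F blows up at the endpoints.
grid = [i / 10000 for i in range(1, 10000)]
p_min = min(grid, key=F)

print(p_min)  # 0.5 -- the minimum does not move with c
```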