Let X be a finite set and A be a set of probability distributions on X.
Then the KL divergence between two probability distributions $P, Q \in A$ is $$D(P\Vert Q)=\sum_{x\in X} P(x)\log\frac{P(x)}{Q(x)}.$$
The KL divergence $D(P\Vert Q)$ is strictly convex in $P$ for a fixed distribution $Q$.
If $A$ is a closed, convex set, then the I-projection of $Q$ onto $A$ is $$P^*=\arg\min_{P\in A} D(P\Vert Q).$$
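As a concrete illustration (my own toy example, not from the book), the I-projection can be computed numerically. The sketch below takes $X=\{0,1,2\}$, $Q$ uniform, and the closed convex set $A=\{P : E_P[x]=0.5\}$, then minimizes $D(P\Vert Q)$ over $A$ with scipy:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical setup: X = {0, 1, 2}, Q uniform.
x = np.arange(3)
q = np.ones(3) / 3

def kl(p):
    """D(P || Q) = sum_x P(x) log(P(x)/Q(x)); clip to avoid log(0)."""
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p / q))

# A = {P : sum_x P(x) = 1, E_P[x] = 0.5} -- a closed convex set.
cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},  # P is a distribution
    {"type": "eq", "fun": lambda p: p @ x - 0.5},    # mean constraint defining A
]

res = minimize(kl, q, bounds=[(0, 1)] * 3, constraints=cons)
p_star = res.x  # numerical I-projection of Q onto A
```

Because $D(\cdot\Vert Q)$ is strictly convex, any local minimizer found this way is the unique global one.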
Does $P^*$ always exist? If yes, how can we prove this mathematically?
PS: I was reading "Information Theory and Statistics: A Tutorial" by Csiszár and Shields. In Chapter 3 (I-Projections), it is stated that since the KL divergence is a continuous and strictly convex function on the closed convex set $A$, $P^*$ exists and is unique.
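My current understanding of how that argument could be spelled out (a sketch, assuming $Q(x)>0$ for all $x\in X$ so that $D(\cdot\Vert Q)$ is finite and continuous): the set of all probability distributions on the finite set $X$ is the simplex $$\Delta = \Big\{P : P(x)\ge 0,\ \sum_{x\in X}P(x)=1\Big\},$$ which is closed and bounded in $\mathbb{R}^{|X|}$, hence compact. A closed subset $A\subseteq\Delta$ is then also compact, and by the extreme value theorem the continuous function $P\mapsto D(P\Vert Q)$ attains its infimum on a nonempty $A$, so $P^*$ exists. For uniqueness, if $P_1\ne P_2$ were both minimizers with value $m$, then by strict convexity $$D\Big(\tfrac{P_1+P_2}{2}\,\Big\Vert\, Q\Big) < \tfrac12 D(P_1\Vert Q)+\tfrac12 D(P_2\Vert Q) = m,$$ and $\tfrac{P_1+P_2}{2}\in A$ by convexity of $A$, contradicting minimality.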