For discrete probability distributions $P$ and $Q$ defined on the same probability space $\mathcal{X}$, the relative entropy from $Q$ to $P$ is defined as $$ D_{\mathrm{KL}}(P \| Q)=\sum_{x \in \mathcal{X}} P(x) \log \left(\frac{P(x)}{Q(x)}\right). $$ Assume my two distributions are $P=[p, 0, \ldots, 0, 1-p]$ and $Q=\operatorname{Binomial}(n,q)$, i.e. $Q(k)=\binom{n}{k} q^{k}(1-q)^{n-k}$ for all $k \in \{0,1,2,\ldots,n\}$.
Both distributions have $n+1$ mass points ($P$ places mass only on the endpoints: $P(0)=p$, $P(n)=1-p$, and $0$ elsewhere).
Can someone help me find the parameter $q$ that minimizes $D_{\mathrm{KL}}(P \| Q)$?
Here your probability space is $\mathcal X = \{0,1,\ldots,n\}$, so with the given distributions $P$ and $Q$ your Kullback–Leibler divergence simply becomes $$\begin{align}D_{\mathrm{KL}}(P \| Q)&=\sum_{x \in \mathcal{X}} P(x) \log \left(\frac{P(x)}{Q(x)}\right)\\ &=\sum_{k \in \{0,1,\ldots,n\}} P(k) \log \left(\frac{P(k)}{Q(k)}\right)\\ &= p\log \left(\frac{p}{(1-q)^n}\right) +(1-p)\log \left(\frac{1-p}{q^n}\right), \end{align}$$ where the $n-1$ middle terms vanish under the usual convention $0 \log 0 = 0$.
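As a quick numerical check of this closed form, here's a minimal Python sketch (the function names are my own) that evaluates the divergence both directly from the binomial pmf and via the two-term expression above:

```python
import math

def kl_closed_form(p, q, n):
    # D_KL(P || Q) = p*log(p/(1-q)^n) + (1-p)*log((1-p)/q^n)
    return p * math.log(p / (1 - q) ** n) + (1 - p) * math.log((1 - p) / q ** n)

def kl_direct(p, q, n):
    # Sum P(k) log(P(k)/Q(k)) over the support {0, ..., n},
    # skipping the P(k) = 0 terms (convention 0*log 0 = 0).
    P = [p] + [0.0] * (n - 1) + [1 - p]
    total = 0.0
    for k, pk in enumerate(P):
        if pk > 0:
            qk = math.comb(n, k) * q**k * (1 - q) ** (n - k)
            total += pk * math.log(pk / qk)
    return total
```

The two functions agree to machine precision, confirming that only the $k=0$ and $k=n$ terms of the sum survive.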
Assuming $p=1-q$ (which makes the means of the two distributions agree: $\mathbb{E}_P[k] = n(1-p) = nq = \mathbb{E}_Q[k]$), this further simplifies to $$\begin{align}D_{\mathrm{KL}}(P \| Q)&= (1-q)\log \left(\frac{1}{(1-q)^{n-1}}\right) +q\log \left(\frac{1}{q^{n-1}}\right) \\ &= (n-1) \cdot\left((1-q)\log\left(\frac{1}{1-q}\right) +q\log \left(\frac{1}{q}\right)\right).\end{align}$$
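A small numeric sanity check of this identity (helper names are my own), substituting $p = 1-q$ into the closed form from the previous step:

```python
import math

def kl_at_matched_p(q, n):
    # Closed-form D_KL(P || Q) with p = 1 - q substituted
    p = 1 - q
    return p * math.log(p / (1 - q) ** n) + (1 - p) * math.log((1 - p) / q ** n)

def binary_entropy(q):
    # (1-q) log(1/(1-q)) + q log(1/q), natural log
    return (1 - q) * math.log(1 / (1 - q)) + q * math.log(1 / q)

# The divergence should equal (n-1) times the binary entropy of q.
```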
So finding the $q$ that minimizes $D_{\mathrm{KL}}(P \| Q)$ is equivalent to finding the $q$ that minimizes $$ (1-q)\log\left(\frac{1}{1-q}\right) +q\log \left(\frac{1}{q}\right), $$ which is just the binary entropy of $q$ (in nats).
Which you should be able to find.
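If you want a numerical hint, a coarse grid search (a sketch only, over the open interval since $\log(1/q)$ is undefined at the endpoints) suggests where the minimum lies:

```python
import math

def h(q):
    # The expression to minimize: (1-q) log(1/(1-q)) + q log(1/q)
    return (1 - q) * math.log(1 / (1 - q)) + q * math.log(1 / q)

# Evaluate on a grid inside (0, 1); q = 0 and q = 1 themselves are
# excluded because log(1/0) is undefined.
grid = [k / 1000 for k in range(1, 1000)]
q_min = min(grid, key=h)
```

The minimizer sits at the edge of the grid, while $q = 1/2$ gives the maximum, consistent with the usual shape of the binary entropy.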