In my lectures it was shown that if $X_{1},X_{2},\dots$ are i.i.d. Bernoulli random variables with parameter $\frac{1}{2}$ and $S_{n}=X_{1}+\dots+X_{n}$, then for $x>\frac{1}{2}$ the following holds: \begin{equation} \lim_{n\rightarrow\infty}\frac{1}{n}\log\mathbb{P}(S_{n}\geq nx)=-I(x), \end{equation} where \begin{equation} I(x)= \begin{cases} \log 2+x\log x+(1-x)\log(1-x) & \text{if }x\in[0,1], \\ \infty & \text{otherwise}. \end{cases} \end{equation}
Now I was wondering what the analogous statement is for an arbitrary parameter $p$ instead of $\frac{1}{2}$. I have found online that for $x>p$ the same limit holds, but with $I(x)$ defined differently, namely as \begin{equation} I(x)= \begin{cases} x\log\frac{x}{p}+(1-x)\log\frac{1-x}{1-p} & \text{if }x\in[0,1], \\ \infty & \text{otherwise}. \end{cases} \end{equation} I cannot find a proof of this, so I attempted one myself. However, using a method similar to the one from my lecture for the $p=\frac{1}{2}$ case, I get \begin{equation} \lim_{n\rightarrow\infty}\frac{1}{n}\log\mathbb{P}(S_{n}\geq nx)=\log(p)-x\log(x)-(1-x)\log(1-x), \end{equation} which is not the same result (I graphed both for several values of $p\in(0,1)$). I have seen that problems like this are often approached with Cramér's theorem, but I believe that will only be covered in my next lecture. Is it possible to do it without?
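As a sanity check before attempting a proof, one can compare the two candidate rate functions against the exact binomial tail numerically. The sketch below (my own illustration, not from the lecture) computes $\frac{1}{n}\log\mathbb{P}(S_{n}\geq nx)$ exactly via a log-sum-exp over the Binomial$(n,p)$ pmf and compares it with $-I(x)$ for the relative-entropy form $I(x)=x\log\frac{x}{p}+(1-x)\log\frac{1-x}{1-p}$:

```python
import math

def log_tail(n: int, p: float, x: float) -> float:
    """Exact log P(S_n >= ceil(n*x)) for S_n ~ Binomial(n, p),
    computed stably with a log-sum-exp over the pmf."""
    k0 = math.ceil(n * x)
    logs = [
        math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        + k * math.log(p) + (n - k) * math.log(1 - p)
        for k in range(k0, n + 1)
    ]
    m = max(logs)
    return m + math.log(sum(math.exp(L - m) for L in logs))

def rate(x: float, p: float) -> float:
    """Candidate rate function: relative entropy of Bernoulli(x) w.r.t. Bernoulli(p)."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

p, x = 0.3, 0.5  # example values with x > p
for n in (100, 1000, 10000):
    print(n, log_tail(n, p, x) / n, -rate(x, p))
```

As $n$ grows, $\frac{1}{n}\log\mathbb{P}(S_{n}\geq nx)$ approaches $-I(x)$ (the gap shrinks like $\tfrac{\log n}{n}$, coming from the polynomial prefactor in Stirling's approximation), which supports the relative-entropy formula rather than the expression from my attempt.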