In the book "Large Deviations" by Frank den Hollander, one reads on p. 30:
Exercise III.10 (Suggested by G. O'Brien.) Let $Z_n$ be a single random variable with a binomial distribution with parameters $n$ and $p_n$. Let $P_n(\cdot)=P(Z_n/np_n\in\cdot)$. Show that if $\lim\limits_{n\to\infty}p_n=0$ and $\lim\limits_{n\to\infty}np_n=\infty$, then $(P_n)$ satisfies the LDP on $\mathbb{R}$ with rate $np_n$ and with rate function given by $I(z)=z\log z-z+1$, $z\geq0$, and $I(z)=\infty$, $z<0$. Does the answer ring a bell? (Recall Exercise I.11.)
So this exercise motivated the following computations:
Let $Z_n = \sum_{i=1}^n X_{i,n}$, where the $X_{i,n} \sim B(p_n)$ are independent Bernoulli random variables with parameter $p_n$. Let $a = 1 + \delta$ with $\delta >0$ and let $\theta>0$ be an arbitrary positive number. Then $$P\bigg( \frac{Z_n}{np_n}>a\bigg) = P\bigg( \frac{Z_n - np_n}{np_n}>\delta\bigg) = P\bigg( \exp\bigg\{\theta\frac{Z_n - np_n}{np_n}\bigg\}>e^{\theta\delta}\bigg) \\ \leq \frac{1}{e^{\theta \delta}} \Bbb{E}\bigg( \exp\bigg\{\theta\frac{Z_n - np_n}{np_n}\bigg\}\bigg), $$ where the last inequality follows from Markov's inequality.
Now we compute
$$e^\theta\,\Bbb{E}\bigg( \exp\bigg\{\theta\frac{Z_n - np_n}{np_n}\bigg\}\bigg) = \Bbb{E}\bigg( \exp\bigg\{\theta\frac{Z_n}{np_n}\bigg\}\bigg) \\ =\prod_{i = 1}^n \Bbb{E}\bigg( \exp\bigg\{\theta\frac{X_{i,n}}{np_n}\bigg\}\bigg) =\prod_{i = 1}^n \bigg( p_ne^{\theta/np_n} + (1 - p_n)\bigg)\\ = \Big(1 + p_n\big(e^{\theta/np_n} - 1\big)\Big)^n = \Bigg(1 + p_n\bigg[\frac{\theta}{np_n} + O\bigg(\frac{1}{n^2p_n^2}\bigg)\bigg]\Bigg)^n \\ = \exp\bigg\{n\log\Bigg(1 + p_n\bigg[\frac{\theta}{np_n} + O\bigg(\frac{1}{n^2p_n^2}\bigg)\bigg]\Bigg)\bigg\}\\ = \exp\bigg\{n\bigg[\frac{\theta}{n} + O\bigg(\frac{1}{n^2p_n}\bigg)\bigg]\bigg\}\to e^\theta. $$ Therefore $\Bbb{E}\big( \exp\big\{\theta\frac{Z_n - np_n}{np_n}\big\}\big) \to 1$. At this point I suspect something went wrong in my computations, since I would like to take the minimum over $\theta>0$ and improve my asymptotic bound, in order to find the large deviation upper bound
$$\limsup_{n\to\infty} \frac{1}{n} \log P\bigg( \frac{Z_n}{np_n}>a\bigg) \leq -I(a).$$
But that is not happening. What should I do?
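Indeed, a quick numerical check (a Python sketch; `centered_mgf` is just an illustrative name, and $p_n = n^{-1/2}$ is one admissible choice of sequence) confirms the limit I computed: the centered moment generating function tends to $1$ under this scaling, so Markov's inequality only yields a trivial bound.

```python
import math

def centered_mgf(n, p, theta):
    """E exp(theta (Z_n - n p)/(n p)) for Z_n ~ Binomial(n, p), in closed form,
    using E exp(theta Z_n/(n p)) = (1 + p (e^{theta/(n p)} - 1))^n."""
    return math.exp(-theta) * (1 + p * math.expm1(theta / (n * p))) ** n

theta = 2.0
for n in [100, 10_000, 1_000_000]:
    p = n ** -0.5          # p_n -> 0 while n p_n = sqrt(n) -> infinity
    print(n, centered_mgf(n, p, theta))   # tends to 1, so the bound is trivial
```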
You are computing the wrong limit to establish the large deviation result you want: the requested rate is $np_n$, not $n$. Let me write $C_X(t) = \log \mathbb{E}(e^{tX})$ for the cumulant generating function. Then Markov's inequality gives, for every $t > 0$,
$$\begin{align*} \mathbb{P} \left( \frac{Z_n}{np_n} > a \right) &\le \inf_{t > 0} \exp \left( C_{Z_n}(t) - anp_n t \right) \\ &= \exp \left( - \sup_{t > 0} \left( anp_n t - C_{Z_n}(t) \right) \right) \\ &= \exp \left( - np_n \sup_{t > 0} \left( a t - \frac{1}{np_n} C_{Z_n}(t) \right) \right) \end{align*}$$
so the expression you want the asymptotics of is actually
$$\frac{1}{np_n} C_{Z_n}(t) = \frac{1}{p_n} \log \left( 1 + p_n (e^t - 1) \right)$$
and happily $n$ no longer appears in the exponent. Since $p_n \to 0$ we get that this is
$$\frac{1}{p_n} \left( p_n(e^t - 1) + O(p_n^2) \right) \to e^t - 1$$
as $n \to \infty$, which you might recognize as the cumulant generating function of the Poisson distribution with mean $\lambda = 1$. So Markov's inequality gets us an upper bound with rate function
$$I(a) = \sup_t \left( at - (e^t - 1) \right) = a \log a - a + 1$$
(for positive $a$) which you might recognize if you computed a large deviation principle for the Poisson distribution. This is reminiscent of the Poisson limit theorem.
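To see the principle numerically, here is a quick sketch (Python; the helper names `I` and `log_tail` are mine, and $p_n = n^{-1/2}$ is just one admissible sequence). It checks the Legendre transform identity above by grid maximization, and then checks that $-\frac{1}{np_n}\log \mathbb{P}(Z_n/np_n > a)$ approaches $I(a)$ as $n$ grows.

```python
import math

def I(a):
    """Candidate rate function I(a) = a log a - a + 1, for a > 0."""
    return a * math.log(a) - a + 1

# Grid check of the Legendre transform: sup_t (a t - (e^t - 1)) = I(a),
# with the supremum attained at t = log a.
a = 2.0
grid_sup = max(a * t / 1000 - math.expm1(t / 1000) for t in range(-3000, 3001))
assert abs(grid_sup - I(a)) < 1e-5

def log_tail(n, p, a):
    """log P(Z_n/(n p) > a) for Z_n ~ Binomial(n, p), via a log-space sum.
    Past the threshold the pmf terms decay geometrically (ratio about 1/a),
    so a truncated sum suffices."""
    k0 = math.floor(a * n * p) + 1          # smallest k with k/(n p) > a
    kmax = min(n, k0 + 500)
    logs = [math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log1p(-p)
            for k in range(k0, kmax + 1)]
    m = max(logs)
    return m + math.log(sum(math.exp(x - m) for x in logs))

for n in [10_000, 100_000, 1_000_000]:
    p = n ** -0.5                           # p_n -> 0 while n p_n -> infinity
    print(n, -log_tail(n, p, a) / (n * p))  # should approach I(2) ~ 0.3863
```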