This is the Chung–Erdős inequality for a union of events (the paper can be accessed here):
Let $\{F_{k}\}_{k=1}^{N}$ be an arbitrary sequence of events in a probability space $(\Omega, \mathcal{F}, P)$.
We have, if $P\left(\bigcup\limits_{k=1}^{N} F_k \right) > 0$,
(1) $$2 \sum_{1\leq j < k \leq N} P(F_{j}F_k) \geq \Bigg[P\bigg(\bigcup_{k=1}^N F_k\bigg)\Bigg]^{-1}\Bigg(\sum_{k=1}^N P(F_k)\Bigg)^{2} - \sum_{k=1}^N P(F_k)$$
Proof: For each $k$ define the indicator random variable $ X_k(\omega)= \begin{cases} 0, & \text{if $\omega \notin F_k$} \\[2ex] 1, & \text{if $\omega \in F_k$} \end{cases}$
The following identity is evident:
(2) $2 \sum\limits_{1\leq j < k \leq N} P(F_j F_k) = E\left[\left(X_1+...+X_N\right)^{2}\right] - E\left(X_{1}^{2} +...+ X_{N}^{2}\right)$
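Identity (2) is easy to verify numerically; below is a quick sketch on a small equally-weighted sample space (the space, the random events, and the seed are all illustrative choices of mine, not part of the original argument):

```python
import itertools
import random

random.seed(0)

# Finite, equally-weighted sample space with M outcomes and N random events.
M, N = 8, 4
omega = range(M)
events = [{w for w in omega if random.random() < 0.5} for _ in range(N)]

def prob(A):
    return len(A) / M

# Left side of (2): 2 * sum over j < k of P(F_j F_k).
lhs = 2 * sum(prob(events[j] & events[k])
              for j, k in itertools.combinations(range(N), 2))

# Right side of (2): E[(X_1+...+X_N)^2] - E[X_1^2+...+X_N^2],
# where X_k is the indicator of F_k, so E[X_k^2] = P(F_k).
e_square = sum(sum(w in F for F in events) ** 2 for w in omega) / M
e_diag = sum(prob(F) for F in events)

assert abs(lhs - (e_square - e_diag)) < 1e-12
print("identity (2) holds:", lhs, "=", e_square - e_diag)
```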
Now by the Schwarz inequality we have
(3) $[E(X_1+...+X_N)]^2 \leq P(X_1+...+X_{N}>0)\, E\left[(X_{1}+...+X_N)^{2}\right]$
Since $E(X_k) = E(X_{k}^{2}) = P(F_k)$ and, by the definition of the $X_k$, $P(X_1+...+X_N>0) = P\left(\bigcup\limits_{k=1}^{N} F_k\right)$, (1) follows from (2) and (3). $\blacksquare$
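The full inequality (1) can also be checked by brute force. Here is a sketch on a small hypothetical finite probability space (the number of outcomes, event probabilities, and seed are arbitrary choices of mine):

```python
import itertools
import random

random.seed(1)

M, N = 10, 5          # outcomes in the sample space, number of events
omega = range(M)

def prob(A):
    return len(A) / M

checked = 0
for _ in range(1000):
    events = [{w for w in omega if random.random() < 0.4} for _ in range(N)]
    union = set().union(*events)
    if not union:
        continue      # (1) requires P(union of the F_k) > 0
    # Left side of (1).
    lhs = 2 * sum(prob(events[j] & events[k])
                  for j, k in itertools.combinations(range(N), 2))
    # Right side of (1).
    s = sum(prob(F) for F in events)
    rhs = s ** 2 / prob(union) - s
    assert lhs >= rhs - 1e-12
    checked += 1
print(f"inequality (1) verified on {checked} random collections of events")
```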
How can I strengthen the inequality above so that this new inequality:
$$P\bigg(\bigcup_{k=n}^{N} A_k\bigg) \geq \frac{\big(\sum_{k=n}^{N} P(A_k)\big)^{2}}{\sum_{k,j=n}^{N} P(A_k A_j)}$$ holds?
The inequality $$2 \sum_{1\leq j < k \leq N} P(F_{j}F_k) \geq \Bigg[P\bigg(\bigcup_{k=1}^N F_k\bigg)\Bigg]^{-1}\Bigg(\sum_{k=1}^N P(F_k)\Bigg)^{2} - \sum_{k=1}^N P(F_k)$$ can be rewritten as \begin{align*} P\bigg(\bigcup_{k=1}^N F_k\bigg) \ge \frac{\Big(\sum_{k=1}^N P(F_k)\Big)^{2}}{2 \sum_{1\leq j < k \leq N} P(F_{j}F_k) + \sum_{k=1}^N P(F_k)}. \end{align*} Note that \begin{align*} \color{red}{2 \sum_{1\leq j < k \leq N} P(F_{j}F_k)} + \color{blue}{\sum_{k=1}^N P(F_k)} &= \color{red}{\sum_{j \neq k} P(F_{j}F_k)} + \color{blue}{\sum_{j = k}P(F_jF_k)} \\ &=\sum_{j,k} P(F_{j}F_k) \end{align*} The red expressions are equal because $P(F_j F_k) = P(F_k F_j)$: the first sum runs only over $j < k$, and the factor $2$ accounts for the symmetric terms with $j > k$. The blue expressions are equal because $P(F_k \cap F_k) = P(F_k)$.
And so \begin{align*} P\bigg(\bigcup_{k=1}^N F_k\bigg) \ge \frac{\Big(\sum_{k=1}^N P(F_k)\Big)^{2}}{\sum_{j,k=1}^{N}P(F_j F_k) }. \end{align*} Reindexing the sums to start at $n$ instead of $1$ and renaming the events from $F$ to $A$ gives the desired inequality.
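As a sanity check of this rewritten form, $P(\bigcup_k F_k) \ge (\sum_k P(F_k))^2 / \sum_{j,k} P(F_j F_k)$, here is a brute-force sketch (again on a small finite probability space of my own choosing):

```python
import random

random.seed(2)

M, N = 12, 6
omega = range(M)

def prob(A):
    return len(A) / M

checked = 0
for _ in range(1000):
    events = [{w for w in omega if random.random() < 0.3} for _ in range(N)]
    # Denominator: sum over ALL ordered pairs (j, k), diagonal included.
    b = sum(prob(Aj & Ak) for Aj in events for Ak in events)
    if b == 0:
        continue      # every event is empty; nothing to check
    a = sum(prob(A) for A in events) ** 2
    assert prob(set().union(*events)) >= a / b - 1e-12
    checked += 1
print(f"rewritten bound verified on {checked} random collections of events")
```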
Proving $P(\limsup_n A_n) \ge 1/C$ with this bound.
Here is the result I will prove (the Kochen–Stone lemma): if $\sum_{n=1}^{\infty} P(A_n) = \infty$, then $$P\Big(\limsup_{n} A_n\Big) \ge \limsup_{n\rightarrow\infty} \frac{\big(\sum_{k=1}^{n} P(A_k)\big)^{2}}{\sum_{i,j=1}^{n} P(A_i A_j)}.$$ In particular, if $\sum_{i,j=1}^{n} P(A_i A_j) \le C \big(\sum_{k=1}^{n} P(A_k)\big)^{2}$ for infinitely many $n$, then $P(\limsup_n A_n) \ge 1/C$.
Proof. Let $a_n = \big(\sum_{k=1}^{n}P(A_k)\big)^2$ and $b_n = \sum_{i,j=1}^n P(A_i A_j)$. By assumption, $a_n \rightarrow \infty$, and since Chung–Erdős gives $b_n \ge a_n P\big(\bigcup_{k=1}^{n}A_k\big) \ge a_n$ is false in general but $b_n \ge a_n$ does hold (Chung–Erdős gives $P\big(\bigcup_{k=1}^{n}A_k\big) \ge a_n/b_n$ and probabilities are at most $1$), $b_n \rightarrow \infty$ as well. Note that $\big(\sum_{k=m+1}^{n}P(A_k)\big)^2 = (\sqrt{a_n}-\sqrt{a_m})^2$ and \begin{align*} \sum_{i,j = m+1}^{n}P(A_iA_j) = b_n - b_m - \sum_{i=1}^{m}\sum_{j=m+1}^{n}P(A_iA_j) - \sum_{i=m+1}^{n}\sum_{j=1}^{m}P(A_iA_j) \le b_n - b_m. \end{align*} Applying the inequality to the events $A_{m+1}, ..., A_n$ and letting $n \rightarrow \infty$, \begin{align*} P\left(\bigcup_{k=m+1}^{\infty}A_k\right) = \lim_{n\rightarrow \infty}P\left(\bigcup_{k=m+1}^{n}A_k\right) \ge \limsup_{n\rightarrow \infty}\frac{(\sqrt{a_n}-\sqrt{a_m})^2}{b_n - b_m} = \limsup_{n\rightarrow \infty}\frac{a_n}{b_n}, \end{align*} where the last equality holds because $a_m$ and $b_m$ are fixed while $a_n, b_n \rightarrow \infty$. Since the unions decrease in $m$, continuity from above gives $P\left(\limsup_n A_n\right) = \lim_{m\rightarrow \infty}P\left(\bigcup_{k=m+1}^{\infty}A_k\right) \ge \limsup_{n\rightarrow \infty}\frac{a_n}{b_n}. \blacksquare$
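To see the bound in action, consider independent events with $P(A_k) = 1/k$ (an example of my own choosing). Writing $s_n = \sum_{k \le n} 1/k$, independence gives $b_n = s_n + s_n^2 - \sum_{k \le n} 1/k^2$ (diagonal terms contribute $P(A_k)$, off-diagonal terms factor), so $a_n/b_n \rightarrow 1$ and the bound recovers the second Borel–Cantelli lemma. A short numerical sketch:

```python
# Independent events with P(A_k) = 1/k: a_n = s_n^2 with s_n the n-th
# harmonic number, and b_n = s_n + s_n^2 - sum_{k<=n} 1/k^2
# (diagonal terms give P(A_k); off-diagonal terms factor by independence).
def ratio(n):
    s = sum(1.0 / k for k in range(1, n + 1))
    q = sum(1.0 / k ** 2 for k in range(1, n + 1))
    return s * s / (s + s * s - q)

# The ratio a_n/b_n increases toward 1, so the Kochen-Stone bound gives
# P(limsup A_n) >= 1, i.e. the events occur infinitely often a.s.
for n in (10, 1000, 10 ** 6):
    print(n, ratio(n))
```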