Proving a Large Deviation type Upper Bound


I got stuck on a question about proving a large deviation type upper bound. The question is: suppose $X_i$ are i.i.d. random variables with a finite moment generating function, let $H(\alpha) = \log Ee^{\alpha X_1}$, let $L(\beta) = \sup_{\alpha}[\alpha\beta - H(\alpha)]$, and let $S_i = X_1 + \cdots + X_i$. For $b \geq EX_1$, does \begin{equation} \limsup_{n\rightarrow\infty} \frac1n \log P\left(\max_{i=1,\cdots,n}\frac1nS_i\geq b\right) \leq -L(b) \end{equation} hold?
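For concreteness (an illustrative example, not part of the question): if $X_1$ is standard normal, both $H$ and $L$ can be computed in closed form,

```latex
% Illustration: X_1 ~ N(0,1)
H(\alpha) = \log E\, e^{\alpha X_1} = \tfrac{\alpha^2}{2},
\qquad
L(\beta) = \sup_{\alpha}\Bigl[\alpha\beta - \tfrac{\alpha^2}{2}\Bigr] = \tfrac{\beta^2}{2},
```

the supremum being attained at $\alpha = \beta$; so for $b \geq 0 = EX_1$ the claimed rate is $-b^2/2$.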

When $EX_1 \geq 0$, for every $\alpha \geq 0$ the process $Y_k = e^{\frac{\alpha}{n}S_k}$ is a submartingale (by Jensen, $Ee^{\frac{\alpha}{n}X_1} \geq e^{\frac{\alpha}{n}EX_1} \geq 1$). Doob's maximal inequality then gives \begin{equation} P\left(\max_{i=1,\cdots,n}\frac1nS_i\geq b\right) = P\left(\max_{i=1,\cdots,n}e^{\frac{\alpha}nS_i}\geq e^{\alpha b}\right) \leq \frac{Ee^{\frac{\alpha}{n}S_n}}{e^{\alpha b}} = \frac{e^{nH(\frac{\alpha}{n})}}{e^{\alpha b}}, \end{equation} so \begin{equation} \frac1n \log P\left(\max_{i=1,\cdots,n}\frac1nS_i\geq b\right) \leq -\left(\frac{\alpha}n b - H\left(\frac{\alpha}{n}\right)\right), \quad \forall \alpha \geq 0. \end{equation} Consequently, substituting $\alpha' = \alpha/n$, \begin{equation} \frac1n \log P\left(\max_{i=1,\cdots,n}\frac1nS_i\geq b\right) \leq -\sup_{\alpha\geq 0}\left(\frac{\alpha}n b - H\left(\frac{\alpha}{n}\right)\right) = -\sup_{\alpha\geq 0}(\alpha b - H(\alpha)) = -L(b). \end{equation} The last equality holds because $b \geq EX_1$: for $\alpha < 0$, Jensen gives $\alpha b - H(\alpha) \leq \alpha b - \alpha EX_1 \leq 0$, so the unrestricted supremum is attained on $\alpha \geq 0$. But in the case where $EX_1 < 0$, $Y_k$ is no longer a submartingale, so this argument doesn't apply. I tried to reduce this case to the $EX_1 \geq 0$ case, but failed. Does anyone know how to prove it in this case? Thanks!
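As a numerical sanity check (my own experiment, not part of the argument): with standard normal steps, $L(b) = b^2/2$, and the Chernoff-type bound above is in fact non-asymptotic, i.e. $P(\max_i S_i/n \geq b) \leq e^{-nL(b)}$ for each fixed $n$. A Monte Carlo simulation is consistent with this:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20            # number of steps
b = 0.5           # threshold, with b >= E[X_1] = 0
trials = 200_000  # Monte Carlo sample size

# For X_i ~ N(0,1): H(alpha) = alpha^2 / 2, hence L(b) = b^2 / 2
L_b = b ** 2 / 2

X = rng.standard_normal((trials, n))
S = np.cumsum(X, axis=1)                 # partial sums S_1, ..., S_n per trial
p_hat = (S.max(axis=1) >= n * b).mean()  # estimate of P(max_i S_i / n >= b)

print(f"empirical probability   : {p_hat:.4f}")
print(f"Chernoff bound e^-nL(b) : {np.exp(-n * L_b):.4f}")
```

The empirical probability comes out well below $e^{-20 \cdot 0.125} \approx 0.082$, as the bound predicts.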


The claimed upper bound fails in the regime of parameters $\mathbb E(X_1)<b<0$.

To see this, note that the event $A_n=\big[\max\limits_{1\leqslant i\leqslant n}S_i\geqslant nb\big]$ is in fact very probable: even though the random walk $(S_i)_{i\geqslant1}$ has a negative drift, it starts from $S_1=X_1$, hence $\max\limits_{1\leqslant i\leqslant n}S_i$ is of order $1$ while $nb\to-\infty$. For example, $A_n\supseteq[X_1\geqslant nb]$ and $\mathbb P(X_1\geqslant nb)\to1$ when $n\to\infty$. Consequently $\frac1n\log\mathbb P(A_n)\to0$, whereas $-L(b)<0$.
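A quick simulation (illustrative choice of distribution: $X_i \sim N(-1,1)$ and $b=-0.5$, so that $\mathbb E X_1 < b < 0$) shows $\mathbb P(A_n)$ staying near $1$ rather than decaying exponentially:

```python
import numpy as np

rng = np.random.default_rng(0)

mu, b = -1.0, -0.5   # E[X_1] = mu < b < 0
trials = 20_000      # Monte Carlo sample size

for n in (10, 50, 200):
    X = mu + rng.standard_normal((trials, n))   # steps with negative drift
    S = np.cumsum(X, axis=1)                    # random walk paths
    p_hat = (S.max(axis=1) >= n * b).mean()     # estimate of P(A_n)
    print(f"n={n:4d}  P(max_i S_i >= n*b) ~ {p_hat:.4f}")
```

Every printed probability is essentially $1$, whereas the putative rate would demand $\mathbb P(A_n)\approx e^{-nL(b)}$ with $L(-0.5)=(b+1)^2/2=0.125>0$ here.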