Etemadi's inequality


In another post, an inequality referred to as "Etemadi's inequality" is mentioned twice: once in the original post and once in the answer. However, the contexts of use raise the question of whether the inequality the users (ziT and saz, respectively) have in mind is the one that goes by that name on Wikipedia.

More specifically, according to the Wikipedia entry mentioned above, Etemadi's inequality is the following statement (proved as Theorem 22.5 on p. 288 of Billingsley's "Probability and Measure", 3rd ed., John Wiley & Sons, 1995).

If $X_1, X_2, \dots, X_n$ are independent real-valued random variables defined on some common probability space, then, setting $S_k := \sum_{i=1}^k X_i$ ($k \in \{1, 2, \dots, n\}$), the following holds for every $\alpha \geq 0$: $$ \mathbb{P}\left(\max_{k \in \{1, 2, \dots, n\}}\left|S_k\right| \geq 3\alpha\right) \leq 3 \max_{k \in \{1, 2, \dots, n\}} \mathbb{P}\left(\left|S_k\right|\geq \alpha\right) $$
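(As a quick editorial aside, not part of the Wikipedia entry: the two sides of this inequality are easy to estimate by Monte Carlo. The sketch below uses a Gaussian random walk; the distribution, $n$, and $\alpha$ are illustrative choices only.)

```python
import numpy as np

# Monte Carlo sanity check of Etemadi's inequality for a Gaussian random
# walk; the step distribution, n, and alpha are illustrative choices.
rng = np.random.default_rng(0)
n, trials, alpha = 20, 100_000, 2.0

steps = rng.standard_normal((trials, n))   # X_1, ..., X_n for each trial
S = np.cumsum(steps, axis=1)               # partial sums S_1, ..., S_n

lhs = np.mean(np.max(np.abs(S), axis=1) >= 3 * alpha)  # P(max_k |S_k| >= 3a)
rhs = 3 * np.max(np.mean(np.abs(S) >= alpha, axis=0))  # 3 max_k P(|S_k| >= a)

print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
assert lhs <= rhs
```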

According to ziT,

Given a Lévy process $(X_t)_{t \in [0,\infty)}$ in $\mathbb{R}^d$ ($d \in \{1, 2, \dots\}$) and setting $X^*_t := \sup_{s\in [0,t]}\left|X_s\right|$, we have by Etemadi's inequality, for every $a > 0$ and every $b>0$ such that $P[X_{t}^{*}\leq b/2]>0$: $$ P[X_{t}^{*}>a+b]\leq \frac{P[|X_{t}|>a]}{P[X_{t}^{*}\leq b/2]} $$

According to saz,

Given a Lévy process $(X_t)_{t \in [0,\infty)}$ in $\mathbb{R}^d$ ($d \in \{1, 2, \dots\}$), then, setting $X^*_t := \sup_{s\in [0,t]}\left|X_s\right|$, if $b>0$ is such that $\mathbb{P}(X_t^* \leq b/2)>0$, we get by Etemadi's inequality for $a=kb$, $k \in \{1, 2, \dots\}$: $$ \mathbb{P}(X_t^* > k b) \leq c \mathbb{P}(|X_t| > (k-1)b) $$ with $c:= 1/\mathbb{P}(X_t^* \leq b/2)$.

My question is:

Can the Wikipedia version of Etemadi's inequality be used to derive the conclusions mentioned by ziT and by saz? If not, what might be the proposition referred to as "Etemadi's inequality" by ziT and by saz?

Best answer:

The inequality which both ziT and I used is a direct consequence of the following inequality.

Lemma 1: Let $X_1,\ldots,X_n$ be independent random variables and $S_k := \sum_{j=1}^k X_j$, $k=1,\ldots,n$. Then for any $a,b \geq 0$ $$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>a+b \right) \leq \frac{\mathbb{P}(|S_n|>a)}{\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j| \leq b/2 \right)}. \tag{1}$$ Here (and throughout this answer) we use the convention $1/0=\infty$.

The proof is very similar to the proof of Etemadi's inequality, but as far as I can see Lemma 1 is not a direct consequence of Etemadi's inequality (see the remark below).

Proof: Fix $a,b \geq 0$. For the disjoint sets $$A_j := \left\{ \max_{1 \leq k <j} |S_k| \leq a+b, |S_j| > a+b \right\}, \qquad j=1,\ldots,n$$ we have $$ \left\{ \max_{1 \leq j \leq n} |S_j| > a+b \right\} = \bigcup_{j=1}^n A_j.$$ Consequently, by the independence of the random variables, $$\begin{align*} \mathbb{P}\left( \max_{1 \leq j \leq n} |S_j| > a+b \right) &\leq \mathbb{P}(|S_n| > a) + \sum_{j=1}^{n-1} \mathbb{P}(A_j \cap \{|S_n| \leq a\}) \\ &\leq \mathbb{P}(|S_n| > a) + \sum_{j=1}^{n-1} \mathbb{P}(A_j) \mathbb{P}(|S_n-S_j|>b) \\ &\leq \mathbb{P}(|S_n| > a) + \mathbb{P}\left( \max_{1 \leq j \leq n} |S_j| > a+b \right) \max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \tag{2} \end{align*}$$ (For the second estimate, note that on $A_j \cap \{|S_n| \leq a\}$ the triangle inequality gives $|S_n-S_j| \geq |S_j|-|S_n| > (a+b)-a = b$, and $A_j$ is independent of $S_n-S_j$.) Hence,

$$ \mathbb{P}\left( \max_{1 \leq j \leq n} |S_j| > a+b \right) \left(1-\max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \right) \leq \mathbb{P}(|S_n| > a). \tag{3}$$

As

$$\max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \leq \mathbb{P} \left( \max_{1 \leq j \leq n} |S_j| > \frac{b}{2} \right)$$

(indeed, $|S_n-S_j|>b$ forces $|S_n|>b/2$ or $|S_j|>b/2$, by the triangle inequality), we have

$$1- \max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \geq \mathbb{P} \left( \max_{1 \leq j \leq n} |S_j| \leq \frac{b}{2} \right).$$

Using this estimate and $(3)$, the claim follows.
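(A numerical illustration of Lemma 1, added editorially and not part of the original answer: for a random walk with small Gaussian steps, both sides of $(1)$ are nontrivial and the inequality can be checked by simulation. All parameter choices below are illustrative.)

```python
import numpy as np

# Monte Carlo check of Lemma 1 for a random walk with N(0, 0.2^2) steps;
# n, a, b, and the step scale are illustrative choices.
rng = np.random.default_rng(1)
n, trials, a, b = 50, 100_000, 1.0, 2.0

S = np.cumsum(0.2 * rng.standard_normal((trials, n)), axis=1)
M = np.max(np.abs(S), axis=1)                 # max_j |S_j|

lhs = np.mean(M > a + b)                      # P(max_j |S_j| > a + b)
denom = np.mean(M <= b / 2)                   # P(max_j |S_j| <= b/2)
rhs = np.mean(np.abs(S[:, -1]) > a) / denom   # P(|S_n| > a) / denom

print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
assert lhs <= rhs
```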

Remark: If we use in $(2)$ the estimate

$$\max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j| >b) \leq 2 \max_{1 \leq j \leq n} \mathbb{P}(|S_j|>b/2)$$ and the fact that $$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>a+b \right) \leq 1,$$ then we get Etemadi's inequality (choose $a= \alpha$, $b=2\alpha$):

$$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>3\alpha \right) \leq 3 \max_{1 \leq j \leq n} \mathbb{P}(|S_j| > \alpha).$$

Lemma 2: Let $(X_t)_{t \geq 0}$ be a Lévy process and $X_t^* := \sup_{s \leq t} |X_s|$ the running maximum. Then for any $a,b>0$ and $t>0$

$$\mathbb{P}(X_t^* > a+b) \leq \frac{\mathbb{P}(|X_t|>a)}{\mathbb{P}(X_t^* \leq b/2)}.$$

Proof: For fixed $n \in \mathbb{N}$ and $t>0$ set $t_j := t \frac{j}{2^n}$, $j \in \{0,\ldots,2^n\}$. The random variables $(X_{t_j}-X_{t_{j-1}})_{j=1,\ldots,2^n}$ are independent, and therefore it follows from Lemma 1 that

$$\mathbb{P}(X_{(n)}^* > a+b) \leq \frac{\mathbb{P}(|X_t|>a)}{\mathbb{P}(X_{(n)}^* \leq b/2)}$$

where $X_{(n)}^* := \max_{1 \leq j \leq 2^n} |X_{t_j}|$. Since $(X_t)_{t \geq 0}$ has (almost surely) càdlàg sample paths, the claim follows by letting $n \to \infty$.
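(The dyadic discretization in this proof also suggests a numerical sanity check, added editorially: taking Brownian motion as the Lévy process, approximate $X_t^*$ on a fine grid and compare both sides of Lemma 2. Grid size and parameters are illustrative.)

```python
import numpy as np

# Approximate X_t^* for Brownian motion on a fine grid, mimicking the
# discretization in the proof of Lemma 2; parameters are illustrative.
rng = np.random.default_rng(2)
t, n_steps, trials, a, b = 1.0, 1024, 50_000, 1.0, 2.0

dt = t / n_steps
incs = np.sqrt(dt) * rng.standard_normal((trials, n_steps))
X = np.cumsum(incs, axis=1)                   # X_{t_j} on the grid
M = np.max(np.abs(X), axis=1)                 # grid approximation of X_t^*

lhs = np.mean(M > a + b)                      # P(X_t^* > a + b)
rhs = np.mean(np.abs(X[:, -1]) > a) / np.mean(M <= b / 2)

print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
assert lhs <= rhs
```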

Lemma 3: Let $(X_t)_{t \geq 0}$ be a Lévy process and $X_t^* := \sup_{s \leq t} |X_s|$. Then $$\mathbb{P}(X_t^* > kb) \leq c \mathbb{P}(|X_t|>(k-1)b)$$ for any $b > 0$ and $k \in \{1, 2, \ldots\}$, where $c := 1/\mathbb{P}(X_t^* \leq b/2)$.

Proof: This follows directly from Lemma 2 applied with $a:= (k-1)b$.