Concentration for Markov process with known expectation and variance?


I have a Markov process $\{X_n\}$ taking values in $[0,1]$. For any $k$ I can calculate the expectation $\mathbb{E}(X_k)$ and bound the variance $\mathbb{V}(X_k)$. If it's important, $\mathbb{E}(X_k)$ is approximately $1-\frac{1}{2^k}$ and $\mathbb{V}(X_k)$ is approximately $\frac{1}{2^{k+1}}$. I would like to obtain good upper tail inequalities, i.e., bounds on $\mathbb{P}(X_k-\mathbb{E}(X_k)\geq c)$ for $c\in \mathbb{R}$. My current plan is to just use the variance and apply Markov's inequality (i.e., Chebyshev's inequality). However, I was wondering whether I have enough information to use "fancier" concentration-of-measure tools, or whether I can somehow leverage the information I have to control higher moments.
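As a minimal sketch of that variance-only plan (function names are hypothetical), here is Markov's inequality applied to the squared deviation, i.e. Chebyshev, together with the one-sided Cantelli refinement, which uses exactly the same two moments. The moment values are the approximations stated above, treated as exact for illustration.

```python
def chebyshev_tail(k: int, c: float) -> float:
    """Bound P(X_k - E[X_k] >= c) using only the variance (Chebyshev)."""
    var = 1.0 / 2 ** (k + 1)            # V(X_k) ~ 2^-(k+1)  (assumed)
    return min(1.0, var / c ** 2)

def cantelli_tail(k: int, c: float) -> float:
    """One-sided Cantelli bound: P(X - EX >= c) <= V / (V + c^2)."""
    var = 1.0 / 2 ** (k + 1)
    return var / (var + c ** 2)

print(chebyshev_tail(10, 0.1))   # about 0.049
print(cantelli_tail(10, 0.1))    # about 0.047, slightly tighter
```

Cantelli is always at least as tight as Chebyshev for a one-sided tail, at no extra cost in information.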

Best answer:

The simplest "fancier" concentration inequalities all apply the same idea as Markov's inequality. Roughly speaking, for a random variable $X$, \begin{align*} P(|X| \geq a) &\leq E|X| / a \tag{Markov} \\ P(|X| \geq a) = P(X^2 \geq a^2) &\leq EX^2/a^2 \tag{Chebyshev}\\ P(X \geq a) = P(e^{tX} \geq e^{ta}) &\leq E[e^{tX}] e^{-ta} \tag{Chernoff}. \end{align*} The idea is mostly the same: use the fact that, for any non-decreasing, non-negative function $f$, we have the inclusion of events $\{X \geq x\} \subset \{f(X) \geq f(x)\}$, so Markov's inequality gives $P(X \geq x) \leq E[f(X)]/f(x)$. The additional idea of the Chernoff bound is to introduce a free parameter $t > 0$ in the exponent, taking $f(x) = e^{tx}$. Then, after applying the original Markov inequality, you can optimize over $t$ to get the tightest bound.
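To see how the three bounds compare in practice, here is a small illustration (not part of the original answer) for $X \sim \mathrm{Exponential}(1)$, where everything is available in closed form: $E|X| = 1$, $EX^2 = 2$, $E[e^{tX}] = 1/(1-t)$ for $t < 1$, and the true tail is $P(X \geq a) = e^{-a}$.

```python
import math

def markov(a: float) -> float:
    return 1.0 / a                      # E|X| / a with E|X| = 1

def chebyshev(a: float) -> float:
    return 2.0 / a ** 2                 # E[X^2] / a^2 with E[X^2] = 2

def chernoff(a: float) -> float:
    # Optimize E[e^{tX}] e^{-ta} = e^{-ta} / (1 - t) over t in (0, 1) on a grid.
    return min(math.exp(-(i / 1000) * a) / (1 - i / 1000)
               for i in range(1, 1000))

a = 10.0
print(markov(a), chebyshev(a), chernoff(a))
```

For large $a$ the optimized Chernoff bound (roughly $a\,e^{1-a}$, attained near $t = 1 - 1/a$) decays exponentially, while Markov and Chebyshev only decay polynomially.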

The choice of function $f$ that you can use depends on how many moments you can control; the more moments you use, the tighter the estimate. From your expressions for the variance, it seems like the moments of your Markov chain decay quickly. It would be ideal if you could compute (or bound) the Laplace transform / moment generating function $E[e^{t X_k}]$, which gives, e.g.,

$$P(X_k \geq c') \leq E[e^{t X_k}] e^{-t c'}$$ for any $t > 0$, where $c' = c+EX_k$; optimizing over $t$ then yields the Chernoff bound.
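As a concrete (hypothetical) instance of this recipe: if one could show that $X_k$ is sub-Gaussian with variance proxy $v$, i.e. $E[e^{t(X_k - EX_k)}] \leq e^{vt^2/2}$ — an assumption for illustration, not something the stated moments guarantee — then optimizing over $t$ (the minimum is at $t = c/v$) gives $P(X_k - EX_k \geq c) \leq e^{-c^2/(2v)}$, which is far tighter than Chebyshev when $v$ is small:

```python
import math

def chernoff_subgaussian(c: float, v: float) -> float:
    """min over t > 0 of exp(v t^2 / 2 - t c), attained at t = c / v."""
    return math.exp(-c ** 2 / (2 * v))

def chebyshev(c: float, v: float) -> float:
    return min(1.0, v / c ** 2)

v = 1.0 / 2 ** 11      # hypothetical variance proxy matching V(X_10) ~ 2^-11
c = 0.1
print(chernoff_subgaussian(c, v))   # exp(-10.24), about 3.6e-5
print(chebyshev(c, v))              # about 0.049: polynomial vs exponential decay
```

The comparison shows why controlling the exponential moment pays off: the Chernoff bound decays like $e^{-c^2/(2v)}$ in the deviation $c$, while the variance-only bound decays only like $c^{-2}$.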