When reading probability publications I am often unsure why a given inequality is called a 'concentration inequality' or a 'large deviation inequality'. To me, concentration of measure and large deviation theory seem to describe the same phenomenon. So I ask: is there a formal difference between concentration inequalities and large deviation inequalities, or are these really different concepts?
Concentration of measure vs large deviation
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-26 · 2.8k views
From my point of view, concentration inequalities and large deviation theory are indeed different concepts. But since many well-known concentration inequalities have the form of large deviation estimates, it is not surprising that the two notions are sometimes not strictly distinguished.
Concentration inequalities are inequalities that bound the probability that a random variable $X$ deviates from its mean or median, i.e. upper bounds for probabilities of the form
$$\mathbb{P}\bigg( |X-\mathbb{E}X| > r \bigg) \quad \text{or} \quad \mathbb{P}\bigg( |X-m(X)| > r \bigg)$$
where $m(X)$ denotes the median of $X$. Two basic concentration inequalities are Markov's inequality,
$$\mathbb{P}(|X| \geq r) \leq \frac{\mathbb{E}(|X|)}{r},$$
and Chebyshev's inequality,
$$\mathbb{P}(|X-\mathbb{E}X| \geq r) \leq \frac{\operatorname{Var} X}{r^2}.$$
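As a quick numerical sanity check (a minimal sketch of my own; the Exponential(1) example, seed, and sample size are my choices, not part of the answer), one can compare the empirical deviation probability of an exponential random variable with the Chebyshev bound:

```python
import math
import random

random.seed(0)

# X ~ Exponential(1): E[X] = 1, Var(X) = 1 (illustrative choice)
n_samples = 100_000
samples = [random.expovariate(1.0) for _ in range(n_samples)]

r = 3.0
# empirical P(|X - E[X]| >= r)
empirical = sum(abs(x - 1.0) >= r for x in samples) / n_samples

chebyshev_bound = 1.0 / r**2       # Var(X) / r^2 = 1/9
exact = math.exp(-(1.0 + r))       # |X - 1| >= 3 forces X >= 4, so P = e^{-4}

print(f"empirical       = {empirical:.4f}")
print(f"exact           = {exact:.4f}")
print(f"Chebyshev bound = {chebyshev_bound:.4f}")
```

The Chebyshev bound (about $0.11$) is far above the true probability (about $0.018$), which already hints at why one looks for sharper, exponential bounds.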
Often one is interested in the asymptotics of such probabilities, i.e. the asymptotic decay of
$$\mathbb{P}\bigg( |X_n-\mathbb{E}(X_n)| > r \bigg)$$
as $n \to \infty$. This is where large deviation theory comes into play: it yields asymptotic estimates of the form
$$\mathbb{P}\bigg( |X_n-\mathbb{E}(X_n)| > r \bigg) \leq e^{-n I(r)} \tag{1}$$
where $I(r) \geq 0$ is called the rate function. The Chernoff bound, Hoeffding's inequality, and McDiarmid's inequality are well-known results of this type. In other words, large deviation theory provides estimates on an exponential scale. Concentration inequalities, however, need not be of the form $(1)$. A different approach uses so-called concentration functions.
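To see the exponential scale of $(1)$ concretely, here is a small check (my own illustration; the fair-coin example is an assumption, not part of the answer) comparing the exact two-sided tail of a fair-coin sample mean with the Hoeffding bound $2e^{-2nr^2}$:

```python
import math

def binom_tail_two_sided(n: int, r: float) -> float:
    """Exact P(|S/n - 1/2| >= r) for S ~ Binomial(n, 1/2)."""
    lo = n * (0.5 - r)
    hi = n * (0.5 + r)
    total = sum(math.comb(n, k) for k in range(n + 1) if k <= lo or k >= hi)
    return total / 2**n

n, r = 100, 0.1
exact = binom_tail_two_sided(n, r)
hoeffding = 2 * math.exp(-2 * n * r**2)   # Hoeffding: 2 e^{-2 n r^2}

print(f"exact tail      = {exact:.4f}")
print(f"Hoeffding bound = {hoeffding:.4f}")
```

Here the rate function is $I(r) = 2r^2$ (up to the constant factor $2$ in front), and the bound decays exponentially in $n$ for fixed $r$.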
Let me finally mention that large deviation theory deals more generally with the asymptotic behavior of probabilities of the form
$$\mathbb{P} \bigg( X_n \in A \bigg),$$
for some fixed set $A$, on an exponential scale, i.e.
$$\mathbb{P} \bigg(X_n \in A \bigg) \approx \exp \bigg(-n \cdot I(A) \bigg).$$
The estimates in $(1)$ are a particular case: they correspond to choosing $A = \mathbb{R} \setminus [\mathbb{E}(X_n)-r, \mathbb{E}(X_n)+r]$.
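As an illustration of this exponential scale (my own example, not from the answer): for $X_n = \frac{1}{n}\sum_{i=1}^n \xi_i$ with $\xi_i$ i.i.d. standard Gaussian, $X_n \sim N(0, 1/n)$, and Cramér's theorem gives the rate function $I(r) = r^2/2$ for the upper tail. One can check numerically that $-\frac{1}{n}\log \mathbb{P}(X_n > r) \to r^2/2$:

```python
import math

def gaussian_mean_tail(n: int, r: float) -> float:
    """P(X_n > r) for X_n ~ N(0, 1/n), via the complementary error function."""
    return 0.5 * math.erfc(r * math.sqrt(n) / math.sqrt(2))

r = 0.5
rate = r**2 / 2   # Cramer rate function for the standard Gaussian: I(r) = r^2/2
for n in (10, 100, 1000):
    p = gaussian_mean_tail(n, r)
    print(f"n = {n:5d}: -log P / n = {-math.log(p) / n:.4f}  (I(r) = {rate:.4f})")
```

The normalized log-probability decreases toward $I(r) = 0.125$ as $n$ grows, never dropping below it, in line with the Chernoff bound $\mathbb{P}(X_n > r) \leq e^{-nr^2/2}$.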