Is the following always true: $\operatorname{Var}[\operatorname{Range}(X_1,\dots,X_n)] = O(n^{-B})$ with $0\leq B \leq 2$?

Here $X_1,\dots,X_n$ are i.i.d. The two extremes $B=0$ and $B=2$, and the standard case $B=1$, are illustrated in the picture below. For the reference, see here.
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Analysis of the range can be reduced to the analysis of the extreme values attained by a distribution when it is bounded on one side. In such special cases, extreme value theory can be applied to study the behavior of the tail distribution and the extreme order statistics. In particular, the Fisher–Tippett–Gnedenko theorem applies when the underlying sampling distribution has suitable structure.
$\operatorname{Var}[\operatorname{Range}(X_1,X_2,\dots,X_n)]$ can be shown to be finite using extreme value theory and convergence to generalized extreme value distributions.
Let $M_n = \max(X_1,X_2,\dots,X_n)$, where the $X_i$ follow a distribution bounded on the left. If there exist sequences of real numbers $a_n > 0$ and $b_n$ such that $\Pr\left(\frac{M_n - b_n}{a_n} \le z\right) \to G(z)$, then $G$ is a generalized extreme value (GEV) distribution. There are three possible GEV families, the Weibull, Gumbel, and Fréchet distributions, and which one arises is determined by the tail behavior of the sampling distribution, reflected in the normalizing sequences $a_n, b_n$.
The Weibull and Gumbel distributions always have finite variance; the Fréchet distribution with shape parameter $\alpha$ has finite variance only when $\alpha > 2$, which is the same condition under which the sampling distribution of the $X_i$ has finite variance.
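For a concrete check of the $\alpha > 2$ threshold, the variance of the standard Fréchet distribution has a closed form in terms of the Gamma function, $\operatorname{Var} = \Gamma(1-2/\alpha) - \Gamma(1-1/\alpha)^2$. A minimal sketch (standard-library Python only; the helper `frechet_variance` is my own illustration, not from any cited source):

```python
import math

# Standard Frechet(alpha): Var = Gamma(1 - 2/alpha) - Gamma(1 - 1/alpha)^2.
# The second moment diverges unless alpha > 2 -- the same threshold as for
# a Pareto(alpha) tail to have finite variance.
def frechet_variance(alpha):
    if alpha <= 2:
        return math.inf  # second moment diverges
    return math.gamma(1 - 2 / alpha) - math.gamma(1 - 1 / alpha) ** 2

for a in (1.5, 2.5, 3.0, 10.0):
    print(f"alpha={a}: variance={frechet_variance(a)}")
```

For $\alpha \le 2$ the variance is infinite, while for $\alpha = 3$ it evaluates to $\Gamma(1/3) - \Gamma(2/3)^2 \approx 0.845$.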
It can be shown that the exponential distribution falls in the domain of attraction of the Gumbel distribution, with the same variance as mentioned in the comments. Distributions with polynomial (heavier-than-exponential) tails fall in the domain of attraction of the Fréchet distribution.
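The Gumbel claim for exponential samples can be checked numerically. The sketch below (assuming Python with `numpy`; the constants are just illustrative choices) centers the maximum of $n$ i.i.d. $\operatorname{Exp}(1)$ draws by $b_n = \ln n$ (with scale $a_n = 1$) and compares the empirical CDF to the Gumbel CDF $\exp(-e^{-z})$:

```python
import numpy as np

# Monte Carlo sketch: for Exp(1) samples, M_n - ln(n) converges in
# distribution to the standard Gumbel, whose CDF is exp(-exp(-z)).
rng = np.random.default_rng(0)
n, reps = 1000, 10000

# Maxima of n i.i.d. Exp(1) draws, centered by b_n = ln n (scale a_n = 1).
maxima = rng.exponential(size=(reps, n)).max(axis=1) - np.log(n)

# Compare the empirical CDF to the Gumbel CDF at a few points.
for z in (-1.0, 0.0, 1.0, 2.0):
    emp = (maxima <= z).mean()
    gumbel = np.exp(-np.exp(-z))
    print(f"z={z:+.1f}  empirical={emp:.3f}  Gumbel={gumbel:.3f}")

# The limiting Gumbel variance is pi^2/6 ~ 1.645, and the sample
# variance of the centered maxima should be close to it.
print("sample variance:", maxima.var())
```

The sample variance of the centered maxima stabilizes near $\pi^2/6$, consistent with the finite-variance claim for the Gumbel limit.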
An example is the Pareto distribution with polynomial tail, whose CDF is
$F_X(x) = 1 - (x_m/x)^{\alpha}$ for $x \in [x_m,\infty)$ and $0$ otherwise.
If $X_i \sim \operatorname{Pareto}(\alpha)$ with $x_m = 1$, then $\min(X_1,X_2,\dots,X_n)$ converges to $x_m$ in probability as $n \to \infty$, so $\operatorname{Range}(X_1,X_2,\dots,X_n)$ can be approximated by $\max(X_1,X_2,\dots,X_n) - x_m$. By the Fisher–Tippett–Gnedenko theorem, the normalized maximum $M_n / n^{1/\alpha}$ converges in distribution to the Fréchet distribution with shape parameter $\alpha$.
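This Pareto-to-Fréchet convergence can also be checked numerically. The sketch below (assuming Python with `numpy`; $\alpha = 3$ and the sample sizes are illustrative choices) samples $\operatorname{Pareto}(\alpha)$ with $x_m = 1$ via the inverse transform $U^{-1/\alpha}$, confirms that the minimum concentrates at $x_m$, and compares the normalized maximum $M_n/n^{1/\alpha}$ against the Fréchet CDF $\exp(-z^{-\alpha})$:

```python
import numpy as np

# Sketch: for Pareto(alpha) with x_m = 1, the normalized maximum
# M_n / n^(1/alpha) converges in distribution to Frechet(alpha),
# whose CDF is exp(-z^(-alpha)) for z > 0.
rng = np.random.default_rng(1)
alpha, n, reps = 3.0, 1000, 10000

# Inverse-transform sampling: if U ~ Uniform(0,1), then
# U**(-1/alpha) ~ Pareto(alpha) with x_m = 1.
samples = rng.uniform(size=(reps, n)) ** (-1.0 / alpha)

# The minimum concentrates at x_m = 1, so Range ~ M_n - 1.
print("mean of min:", samples.min(axis=1).mean())

normalized = samples.max(axis=1) / n ** (1.0 / alpha)
for z in (0.5, 1.0, 2.0):
    emp = (normalized <= z).mean()
    frechet = np.exp(-z ** -alpha)
    print(f"z={z:.1f}  empirical={emp:.3f}  Frechet={frechet:.3f}")
```

With $\alpha = 3 > 2$ both the Pareto samples and the Fréchet limit have finite variance, matching the discussion above.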
The condition $\alpha > 2$, which makes the variance of the Pareto distribution finite, also makes the variance of the Fréchet distribution finite. This falls into the case $B = 0$.
I understand this answer doesn't capture all possible distributions of the $X_i$, but the convergence criteria of extreme value theory cover most of the heavy-tailed distributions used for modeling extreme observations.