Dependent increments?

Let us assume that $X_1,\ldots,X_n$ are i.i.d. exponential random variables, each with rate $\lambda$. Define $M = \max\{X_1,\ldots,X_n\}$. If I'm interested in $\mathbb{E}M$, I can split $M$ into (independent?) increments, namely $$ M = X_{(1)}^{(n)} + X_{(1)}^{(n-1)} + \cdots + X_{(1)}^{(1)}, $$ where $X_{(1)}^{(n)} = \min\{X_1,\ldots,X_n\}$, $X_{(1)}^{(n-1)}$ is the minimum of the remaining $n-1$ random variables, and so on. By the memorylessness property, each $X_{(1)}^{(k)} \sim \mathcal{E}xp(k\lambda)$. Hence $$ \mathbb{E}M = \sum_{k=1}^n\frac{1}{k\lambda}, $$ and $$ \mathbb{V}(M)=\sum_{k=1}^n\frac{1}{k^2\lambda^2}, $$ which means that as $n \to \infty$ the variance converges to $\pi^2/(6\lambda^2)$ while the expectation diverges. That seems like nonsense. What am I missing? Are those increments dependent, so that I have to account for covariances that will "explode"? Thank you.
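For what it's worth, a quick Monte Carlo check (with arbitrarily chosen $n = 10$, $\lambda = 2$, and sample size) reproduces both formulas, so the algebra itself seems fine:

```python
# Minimal sanity check of E[M] and Var(M) against simulation.
# n, lam, and the number of trials are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, lam, trials = 10, 2.0, 200_000

# Simulate M = max of n i.i.d. Exp(lam) variables, many times over.
M = rng.exponential(scale=1.0 / lam, size=(trials, n)).max(axis=1)

k = np.arange(1, n + 1)
print("E[M]   simulated:", M.mean(), " formula:", (1.0 / (lam * k)).sum())
print("Var(M) simulated:", M.var(), " formula:", (1.0 / (lam * k) ** 2).sum())
```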
You can think of the independent exponentials as the waiting times until the first click of each of $n$ independent rate-$\lambda$ Poisson processes. The maximum is the total time until the last one clicks.
Say the first $n-1$ have clicked. Then by memorylessness, the remaining time until the final click is exponential with rate $\lambda$ and independent of everything else. If the first $n-2$ have clicked, then the remaining time until the $(n-1)$-st one clicks is the minimum of two independent exponentials, again independent of everything else. Continuing backward inductively, the time until the final click is a sum of $n$ independent variables, where the first is distributed like the minimum of $n$ exponentials (rate $n\lambda$), the second like the minimum of $n-1$ exponentials (rate $(n-1)\lambda$), and so on.
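If a numerical check helps, here is a minimal simulation sketch of that decomposition (the choices $n = 5$, $\lambda = 1$, and the trial count are arbitrary): the spacings between consecutive order statistics have the predicted means $1/(k\lambda)$ for $k = n, n-1, \ldots, 1$, and are essentially uncorrelated, consistent with independence.

```python
# Sketch of the backward-induction picture: the spacings between consecutive
# order statistics of n i.i.d. Exp(lam) variables behave like independent
# exponentials with rates n*lam, (n-1)*lam, ..., lam.
import numpy as np

rng = np.random.default_rng(1)
n, lam, trials = 5, 1.0, 200_000

X = np.sort(rng.exponential(scale=1.0 / lam, size=(trials, n)), axis=1)
D = np.diff(X, axis=1, prepend=0.0)   # D[:, k] = X_(k+1) - X_(k), with X_(0) = 0

for k in range(n):
    rate = (n - k) * lam              # minimum of the n-k still-running clocks
    print(f"spacing {k + 1}: mean {D[:, k].mean():.4f} vs 1/rate {1 / rate:.4f}")

# Correlations between spacings are near zero, consistent with independence:
print(np.round(np.corrcoef(D, rowvar=False), 3))
```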
EDIT
I realize I forgot to answer your whole question. Hopefully it's clear from the above that you can use independence in the way you wanted to, and that your expressions are correct. There is no contradiction in the expectation diverging as $n\to\infty$ while the variance converges to a constant. Imagine a sequence $X_n$ of normal random variables where $X_n$ has mean $n$ and variance one: the means diverge while the variances are constant. There is absolutely no problem with this.
I think you're confusing this with the case of a single random variable having infinite expectation but finite variance, which would indeed be impossible (a finite second moment forces a finite mean). But here each of the random variables $M_n = \max(X_1,\ldots, X_n)$ has a finite expectation and a finite variance. It's just that as the sequence goes on, the means get bigger and bigger while the variances don't.
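To make this concrete, here is a tiny numeric sketch (assuming $\lambda = 1$, so that $\mathbb{E}M_n$ is the harmonic number $H_n$): the means keep growing while the variances stay below $\pi^2/6$.

```python
# Partial sums with lam = 1 (an assumed normalization): E[M_n] = H_n grows
# without bound, while Var(M_n) = sum of 1/k^2 stays below pi^2/6.
import numpy as np

for n in (10, 100, 1_000, 10_000):
    k = np.arange(1, n + 1)
    print(n, "mean:", round((1 / k).sum(), 3), "variance:", round((1 / k**2).sum(), 3))

print("pi^2/6 =", round(np.pi ** 2 / 6, 3))
```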