Expected waiting time of last item for a set of m exponential random variables

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) at 2026-03-27 06:08:54 · 48 views · 1 answer below.

I've been mulling over a question: suppose I have a type of object that breaks after a waiting time $T \sim \mathrm{Exp}(\lambda)$. If I look at $m$ of these objects, what is the expected waiting time of the last one to break, i.e. $E(T_m)$? My intuition tells me this is a Gamma distribution, so $E(T_m)=\frac{m}{\lambda}$ with $\mathrm{Var}(T_m)=\frac{m}{\lambda^2}$. That said, I'd like to know if I'm wrong and, if so, how to think about the problem differently. Thanks!
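The intuition can be checked numerically before deriving anything. Here is a minimal Monte Carlo sketch (the rate $\lambda = 1$, $m = 5$, and the trial count are arbitrary choices for illustration) comparing the mean of the *maximum* of the $m$ waiting times against the mean of their *sum* — the sum being the quantity a Gamma$(m, \lambda)$ distribution actually describes:

```python
import random

random.seed(1)
lam = 1.0        # rate parameter (arbitrary choice for the check)
m = 5            # number of objects
n = 100_000      # Monte Carlo trials

# Mean of the LAST breakage time: max of m i.i.d. Exp(lam) draws
mean_max = sum(max(random.expovariate(lam) for _ in range(m))
               for _ in range(n)) / n

# Mean of the SUM of the m waiting times: this is what Gamma(m, lam) describes
mean_sum = sum(sum(random.expovariate(lam) for _ in range(m))
               for _ in range(n)) / n

print(mean_max)  # ~2.28, noticeably smaller than m/lam = 5
print(mean_sum)  # ~5.0, matching the Gamma(m, lam) mean m/lam
```

The two means differ substantially, which already shows the maximum is not Gamma-distributed with mean $m/\lambda$.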
$T_m$ is the maximum of a set of $m$ i.i.d. exponentially distributed random variables $X_1, \dots, X_m$, each with cumulative distribution function $F(x) = 1 - e^{-\lambda x}$ for $x > 0$. Therefore the c.d.f. of $T_m$ is
$$ P(T_m \le x) = P(\forall i \; X_i \le x) = \prod_i P(X_i \le x) = F(x)^m = \left(1 - e^{-\lambda x} \right)^m, $$
and the probability density function is the derivative of this expression. That is not a gamma distribution: the gamma distribution arises as the *sum* of the $X_i$, not their maximum. In particular, $E(T_m) = \lambda^{-1}c_m$, where
$$ \begin{aligned} c_m &= \int_0^\infty m x e^{-x}\left(1-e^{-x}\right)^{m-1} dx = \sum_{k=0}^{m-1} (-1)^k m \binom{m-1}{k}\int_0^\infty x e^{-(k+1)x} dx \\ &= \sum_{k=0}^{m-1} (-1)^k \binom{m}{k+1} \frac{1}{k+1} = H_m \\ &=\psi^{(0)}(1+m) + \gamma, \end{aligned} $$
where $H_m = \sum_{k=1}^m \frac{1}{k}$ is the $m$-th harmonic number, $\psi^{(0)}(z) = \frac{\Gamma'(z)}{\Gamma(z)}$ is the polygamma function of order $0$ (the digamma function), and $\gamma \approx 0.577\dots$ is Euler's constant. This means that $E(T_m)\sim (\log m + \gamma)/\lambda$ for large $m$.
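The closed form $E(T_m) = H_m/\lambda$ and its logarithmic asymptotic are easy to verify by simulation. A short sketch (the choices $\lambda = 2$, $m = 10$, and the trial count are arbitrary):

```python
import math
import random

random.seed(0)
lam = 2.0          # rate parameter (arbitrary choice for the demo)
m = 10             # number of objects
n_trials = 200_000

# Empirical mean of T_m = max of m i.i.d. Exp(lam) variables
empirical = sum(max(random.expovariate(lam) for _ in range(m))
                for _ in range(n_trials)) / n_trials

# Closed form: E(T_m) = H_m / lam, with H_m the m-th harmonic number
harmonic = sum(1.0 / k for k in range(1, m + 1))
exact = harmonic / lam

# Large-m asymptotic: E(T_m) ~ (log m + gamma) / lam
asymptotic = (math.log(m) + 0.5772156649015329) / lam

print(empirical, exact, asymptotic)
```

Even at $m = 10$ the empirical mean sits close to $H_m/\lambda \approx 1.46$, while the asymptotic $(\log m + \gamma)/\lambda$ is already within a few percent of it.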