I know that if the moment generating functions of two distributions converge to the same function, then the two distributions converge in CDF. But how can we prove this explicitly?
How to prove that convergence in MGF implies Convergence in Distribution?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
I'm not really sure what you mean by "two distributions" and then use the word "converges". Here's what I assume: let $\mu_n$ and $\mu$ be probability measures on the real half-line $[0,\infty)$. Let $L_n$ and $L$ be their moment generating functions in the Laplace-transform convention, $L_n(t) = \int_{x \geq 0} \exp(-tx)\, d\mu_n(x)$ for $t \geq 0$ (this is the usual mgf evaluated at $-t$; it is finite for every $t \geq 0$ because the measures are supported on $[0,\infty)$). We want to know whether $L_n(t) \rightarrow L(t)$ for every $t \geq 0$ implies $\mu_n \Rightarrow \mu$, where $\Rightarrow$ denotes weak convergence. (Equivalently, for measures on the line, $\Rightarrow$ means $F_n(x) \rightarrow F(x)$ at every continuity point of $F$.) I'll sketch a proof of this statement, which I learned from Billingsley's Convergence of Probability Measures, Example 5.5.
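Before the proof, here is a quick numerical sanity check of the statement itself. It is purely illustrative: the family $\mu_n = \mathrm{Exp}(1+1/n) \to \mu = \mathrm{Exp}(1)$ and the evaluation grid are arbitrary choices made for this snippet (assuming NumPy is available), not part of the argument below.

```python
import numpy as np

# Toy sanity check (illustrative only; the choice mu_n = Exp(1 + 1/n), mu = Exp(1)
# is an assumption of this example, not part of the proof below).
# Laplace transform of Exp(lam) on [0, oo): L(t) = lam / (lam + t).
# CDF of Exp(lam): F(x) = 1 - exp(-lam * x).

def laplace_exp(lam, t):
    return lam / (lam + t)

def cdf_exp(lam, x):
    return 1.0 - np.exp(-lam * x)

t_grid = np.linspace(0.0, 5.0, 51)
x_grid = np.linspace(0.0, 5.0, 51)

for n in (1, 10, 100, 1000):
    lam = 1.0 + 1.0 / n
    gap_L = np.max(np.abs(laplace_exp(lam, t_grid) - laplace_exp(1.0, t_grid)))
    gap_F = np.max(np.abs(cdf_exp(lam, x_grid) - cdf_exp(1.0, x_grid)))
    print(f"n={n:5d}   sup|L_n - L| = {gap_L:.5f}   sup|F_n - F| = {gap_F:.5f}")
```

Both gaps shrink together, which is exactly what the theorem predicts; the proof below explains why this is not a coincidence.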
The idea of the proof is an application of Prohorov's theorem. A family of probability measures $\mathcal{F}$ is said to be tight if for every $\epsilon > 0$ there is a compact set $K \subset \mathbb{R}$ such that $$\mu(K) > 1-\epsilon$$ for all $\mu \in \mathcal{F}$.
Prohorov's theorem says that if the family $\mathcal{F}$ is tight, then it is relatively compact. That is, for any sequence $\{\mu_n\}_{n \geq 1} \subset \mathcal{F}$, there is a subsequence $\mu_{n_k}$ such that $\mu_{n_k} \Rightarrow \nu$ for some probability measure $\nu$.
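To see why tightness is the substantive issue, here is a small toy non-example (again purely illustrative; the family $\mathrm{Exp}(1/n)$ is chosen only for this snippet): the transforms converge pointwise for $t > 0$, but the limit function is not continuous at $0$, and the mass escapes to infinity, so no subsequence converges weakly to a probability measure.

```python
import numpy as np

# Toy non-example (illustrative assumption: mu_n = Exp(rate = 1/n)).
# Here L_n(t) = (1/n) / (1/n + t) -> 0 for every t > 0, while L_n(0) = 1, so the
# pointwise limit is not continuous at 0 and is not the transform of any probability
# measure; correspondingly the mass escapes every fixed compact window [0, M].

M = 100.0  # a fixed compact window [0, M]
for n in (1, 10, 100, 10000):
    rate = 1.0 / n
    Ln_at_half = rate / (rate + 0.5)          # L_n(0.5)
    mass_in_window = 1.0 - np.exp(-rate * M)  # mu_n([0, M]) for Exp(rate)
    print(f"n={n:6d}   L_n(0.5)={Ln_at_half:.5f}   mu_n([0, {M:.0f}])={mass_in_window:.5f}")
```

This is exactly the failure mode that the continuity of the limiting transform at $t = 0$ rules out in the argument below.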
The moment generating function (in the Laplace-transform convention above) of a random variable $X$ with distribution $\mu$ on $\mathbb{R}_{\geq 0}$ is given by $$L(t) = \int_0^\infty e^{-tx}\,d\mu(x).$$ Suppose that $L_n(t) \rightarrow L(t)$ pointwise for all $t \geq 0$. Note that \begin{align*} \frac{1}{u}\int_0^u (1 - L(t))\, dt &= \frac{1}{u} \int_{x \geq 0}\int_0^u (1 - e^{-tx})\, dt\, d\mu(x) \tag{Fubini, $\int d\mu = 1$}\\ &\geq \frac{1}{u} \int_{x \geq 1/u} \int_0^u (1 - e^{-tx})\, dt\, d\mu(x) \\ &\geq \frac{1}{u} \int_{x \geq 1/u} \int_0^u (1 - e^{-t/u})\, dt\, d\mu(x) \tag{monotonicity in $x$} \\ &= \int_{x \geq 1/u} e^{-1}\, d\mu(x) = e^{-1}\mu([1/u, \infty)). \end{align*} Write $$B_u(L) = \frac{1}{u}\int_0^u (1-L(t))\, dt,$$ so the computation says $\mu([1/u,\infty)) \leq e\,B_u(L)$, and the same computation applied to $\mu_n$ gives $\mu_n([1/u,\infty)) \leq e\,B_u(L_n)$. Now fix $\epsilon > 0$. Since $L(0) = 1$ and $L$ is continuous at $t = 0$, we can choose $u_0 > 0$ so small that $e\,B_{u_0}(L) < \epsilon/2$. Moreover, since $0 \leq L_n(t) \leq 1$ for $t \geq 0$ and $L_n(t) \rightarrow L(t)$ pointwise, the bounded convergence theorem gives $B_{u_0}(L_n) \rightarrow B_{u_0}(L)$, so there is an $N$ such that $e\,B_{u_0}(L_n) < \epsilon$, hence $\mu_n([1/u_0,\infty)) < \epsilon$, for all $n > N$. For each of the finitely many indices $n \leq N$, the single measure $\mu_n$ is tight on its own, so we can choose $u_n > 0$ with $\mu_n([1/u_n,\infty)) < \epsilon$. Take $u = \min(u_0, u_1, \dots, u_N)$ and the compact set $K = [0,1/u]$: then $\mu_n(K) > 1-\epsilon$ for every $n$. This proves the family of measures is tight, and so by Prohorov's theorem there is a subsequence $F_{n_i}(x)$ which converges to some cdf $G(x)$ at all continuity points of $G$. It remains to show that $G$ is in fact $F$.
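If it helps, the displayed inequality can be spot-checked numerically for a concrete $\mu$; the choice $\mu = \mathrm{Exp}(1)$ below is just an illustration and carries no weight in the proof.

```python
import numpy as np

# Spot-check of  B_u(L) = (1/u) int_0^u (1 - L(t)) dt  >=  e^{-1} mu([1/u, oo))
# for the illustrative choice mu = Exp(1) (an assumption of this example only).
# For Exp(1):  L(t) = 1/(1+t),  so  B_u(L) = 1 - log(1+u)/u,  and  mu([1/u, oo)) = exp(-1/u).

for u in (0.1, 0.5, 1.0, 2.0, 5.0):
    lhs = 1.0 - np.log1p(u) / u              # closed form of (1/u) int_0^u t/(1+t) dt
    rhs = np.exp(-1.0) * np.exp(-1.0 / u)    # e^{-1} * mu([1/u, oo))
    print(f"u={u:4.1f}   B_u(L)={lhs:.4f}   e^(-1) mu([1/u,oo))={rhs:.4f}   holds: {lhs >= rhs}")
```

Of course the point of the proof is that the bound holds for every probability measure on $[0,\infty)$, not just this one.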
Now, since $x \mapsto e^{-tx}$ is bounded and continuous on $[0,\infty)$ for each fixed $t \geq 0$, the weak convergence $F_{n_i} \Rightarrow G$ gives $L_{n_i}(t) \rightarrow L_G(t)$, where $L_G$ is the mgf (Laplace transform) of $G$. But $L_{n_i}(t) \rightarrow L(t)$ by assumption, so $L_G = L$, and therefore $G = F$ by uniqueness of the Laplace transform. Finally, since every subsequence of $(\mu_n)$ thus has a further subsequence converging weakly to $\mu$, the whole sequence satisfies $\mu_n \Rightarrow \mu$; that is, $F_n(x) \rightarrow F(x)$ at every continuity point of $F$.