Consider a mixture of measures $Q$ given by
$$ Q = \sum_{k=1}^{N} p_k D_k $$
where each $D_k$ is a probability measure with known finite mean $\mu_k$ and variance $\sigma_k^2$, the $D_k$ are independent of one another, and
$$ \sum_{k=1}^{N} p_k = 1. $$
Using the standard approach of taking expectations, it can be shown that the variance of this mixture of probability measures is
$$ \text{Var}(Q) = \sum_{k=1}^{N} p_k (\sigma_k^2 + \mu_k^2) - \left( \sum_{k=1}^{N} p_k \mu_k \right)^2. $$

I am attempting to obtain this result using the Law of Total Variance, which states that for random variables $X$ and $Y$ with $\text{Var}(Y) < \infty$ (no independence between $X$ and $Y$ is required),
$$ \text{Var}(Y) = \mathbb{E}\left[ \text{Var}\left( Y \mid X \right) \right] + \text{Var}\left( \mathbb{E}\left[ Y \mid X \right] \right), $$
but so far I have been failing miserably. My first thought was to write
$$ \text{Var}(Q) = \mathbb{E}\left[ \text{Var}\left( \left. \sum_{k=1}^{N} p_k D_k \right\vert D_j \right) \right] + \text{Var}\left( \mathbb{E}\left[ \left. \sum_{k=1}^{N} p_k D_k \right\vert D_j \right] \right) $$
and then determine that
$$ \text{Var}\left( \left. \sum_{k=1}^{N} p_k D_k \right\vert D_j \right) = \text{Var}\left( \sum_{k \neq j}^{N} p_k D_k \right) $$
and
$$ \mathbb{E}\left[ \left. \sum_{k=1}^{N} p_k D_k \right\vert D_j \right] = p_j D_j + \sum_{k \neq j}^{N} p_k \mu_k $$
to finally get
$$ \text{Var}(Q) = \mathbb{E}\left[ \text{Var}\left( \sum_{k \neq j}^{N} p_k D_k \right) \right] + p_j^2 \sigma_j^2. $$
I could then repeat this process to find
$$ \text{Var}\left( \sum_{k \neq j}^{N} p_k D_k \right) $$
and iterate the expectations, but this would eventually yield
$$ \text{Var}(Q) = \sum_{k=1}^{N} p_k^2 \sigma_k^2, $$
which is certainly incorrect. I suspect the issue is that I am simply treating the probability measures $D_k$ themselves as random variables in order to apply the form of the Law of Total Variance that I know, but this must be wrong.
How may I apply the Law of Total Variance to a mixture of probability measures, or perhaps expand the idea of the Law of Total Variance?
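As a numerical sanity check of the closed-form variance above, the sketch below (using illustrative component weights, means, and standard deviations, and normal components purely for concreteness) compares the formula against a Monte Carlo estimate drawn from the mixture:

```python
import random
import statistics

# Illustrative mixture parameters (assumed values, not from the question)
p     = [0.5, 0.3, 0.2]    # mixture weights p_k, summing to 1
mu    = [0.0, 2.0, -1.0]   # component means mu_k
sigma = [1.0, 0.5, 2.0]    # component standard deviations sigma_k

# Closed-form mixture variance: sum p_k (sigma_k^2 + mu_k^2) - (sum p_k mu_k)^2
closed_form = (sum(pk * (sk**2 + mk**2) for pk, mk, sk in zip(p, mu, sigma))
               - sum(pk * mk for pk, mk in zip(p, mu)) ** 2)

# Monte Carlo: draw the component index first, then X given that component
rng = random.Random(0)
def draw():
    k = rng.choices(range(len(p)), weights=p)[0]
    return rng.gauss(mu[k], sigma[k])

samples = [draw() for _ in range(200_000)]
mc_variance = statistics.pvariance(samples)

print(closed_form)   # exact mixture variance
print(mc_variance)   # Monte Carlo estimate, close to closed_form
```

With 200,000 draws the Monte Carlo estimate typically agrees with the closed form to roughly two decimal places.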
Using Law of Total Variance to find the Variance of Mixture Distribution
219 Views Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail
There is 1 best solution below.
Let $Z$ be a random variable with $\mathbb{P}(Z = k) = p_k$, and conditionally on $Z = k$ let $X \sim D_k$, so that $X$ has marginal distribution $Q$. Writing $\mu(Z)$ and $\sigma^2(Z)$ for the conditional mean and variance, we have $\mathbb{E}[X \mid Z] = \mu(Z)$ and $\text{Var}(X \mid Z) = \sigma^2(Z)$, i.e. $\mathbb{E}[X \mid Z = k] = \mu_k$ and $\text{Var}(X \mid Z = k) = \sigma_k^2$. Applying the total variance identity, with $\overline{\mu} := \sum_{k=1}^{N} p_k \mu_k$, we have \begin{align*} \text{Var}(X) &= \mathbb{E}\left[ \text{Var}(X \mid Z) \right] + \text{Var}\left( \mathbb{E}[X \mid Z] \right) \\ &= \mathbb{E}\left[ \sigma^2(Z) \right] + \text{Var}\left( \mu(Z) \right) \\ &= \sum_{k=1}^{N} p_k \sigma_k^2 + \sum_{k=1}^{N} p_k (\mu_k - \overline{\mu})^2 \\ &= \sum_{k=1}^{N} p_k \sigma_k^2 + \sum_{k=1}^{N} p_k \mu_k^2 - \overline{\mu}^2 \\ &= \sum_{k=1}^{N} p_k \left( \sigma_k^2 + \mu_k^2 \right) - \left( \sum_{k=1}^{N} p_k \mu_k \right)^2, \end{align*} which is precisely the previously derived result. The key point is that the conditioning variable is the discrete component label $Z$, not the component measures $D_j$ themselves.
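The decomposition can also be verified directly. The sketch below (again with illustrative, assumed parameters) computes the within-component term $\mathbb{E}[\text{Var}(X \mid Z)]$ and the between-component term $\text{Var}(\mathbb{E}[X \mid Z])$ and confirms their sum equals the closed-form mixture variance:

```python
# Illustrative mixture parameters (assumed values, not from the question)
p      = [0.5, 0.3, 0.2]     # p_k
mu     = [0.0, 2.0, -1.0]    # mu_k
sigma2 = [1.0, 0.25, 4.0]    # sigma_k^2

# Overall mean: mu_bar = E[mu(Z)] = sum p_k mu_k
mu_bar = sum(pk * mk for pk, mk in zip(p, mu))

# Within-component term: E[Var(X | Z)] = sum p_k sigma_k^2
within = sum(pk * s2 for pk, s2 in zip(p, sigma2))

# Between-component term: Var(E[X | Z]) = sum p_k (mu_k - mu_bar)^2
between = sum(pk * (mk - mu_bar) ** 2 for pk, mk in zip(p, mu))

# Closed form: sum p_k (sigma_k^2 + mu_k^2) - mu_bar^2
closed_form = (sum(pk * (s2 + mk**2) for pk, mk, s2 in zip(p, mu, sigma2))
               - mu_bar ** 2)

print(within + between)   # matches closed_form up to rounding
print(closed_form)
```

The two printed values agree up to floating-point rounding, since the identity is exact in the parameters.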