Derivation of MMSE from an estimator of two Gaussians

Suppose $X$ and $N$ are independent Gaussians with different variances, and $N$ has zero mean. Let $Y = X + N$. I am trying to find the minimum mean square error (MMSE) estimator of $X$ given $Y$. I set the estimator to be the expected value of $X$ given $Y = y$, and working through the integral I got the estimator $\rho^2 Y$, where $\rho$ is the ratio of the standard deviations of $X$ and $Y$ (I am not sure this is even correct). Now I am trying to find its MMSE. The book says the MMSE is the variance of $X$ multiplied by $1 - \rho^2$, but when I plug my estimator into the error expression I cannot get rid of the expectation of $X$. Could someone give me a derivation of this answer?
1 Answer
You can use the fact that the linear MMSE estimator of $X$ given $Y$ is $E[X] + \operatorname{cov}(X,Y)\operatorname{cov}(Y,Y)^{-1}(Y - E[Y])$, and that for jointly Gaussian $X, Y$ the linear MMSE estimator and the MMSE estimator coincide (*). In this case, $X$ and $Y$ are jointly Gaussian, since any linear combination of them is Gaussian.
It is then easy to compute $E[Y] = E[X] + E[N] = E[X]$ (since $N$ has zero mean), $\operatorname{cov}(Y,Y) = \operatorname{var}(Y) = \operatorname{var}(X+N) = \operatorname{var}(X) + \operatorname{var}(N)$ by independence, and $\operatorname{cov}(X,Y) = \operatorname{cov}(X,X+N) = \operatorname{cov}(X,X) + \operatorname{cov}(X,N) = \operatorname{var}(X) + 0 = \operatorname{var}(X)$.
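Substituting these into the formula above gives the estimator explicitly. Writing $\rho^2 = \operatorname{var}(X)/\operatorname{var}(Y)$ (the squared ratio of standard deviations from the question):
$$
\hat{X} = E[X] + \frac{\operatorname{var}(X)}{\operatorname{var}(X) + \operatorname{var}(N)}\bigl(Y - E[X]\bigr) = E[X] + \rho^2\bigl(Y - E[X]\bigr).
$$
Note that $E[X]$ does not drop out in general; the estimator reduces to $\rho^2 Y$ only when $E[X] = 0$.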
To compute the MSE, you can use the fact (shown in Section 3.3 of (*)) that the linear MMSE estimator's MSE is $\operatorname{cov}(X,X) - \operatorname{cov}(X,Y)\operatorname{cov}(Y,Y)^{-1}\operatorname{cov}(Y,X)$. As above, $\operatorname{cov}(Y,X) = \operatorname{cov}(X+N,X) = \operatorname{cov}(X,X) + \operatorname{cov}(N,X) = \operatorname{cov}(X,X) = \operatorname{var}(X)$ by independence.
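Plugging in these quantities yields the book's answer:
$$
\text{MMSE} = \operatorname{var}(X) - \frac{\operatorname{var}(X)^2}{\operatorname{var}(X) + \operatorname{var}(N)} = \operatorname{var}(X)\left(1 - \frac{\operatorname{var}(X)}{\operatorname{var}(Y)}\right) = \operatorname{var}(X)\,(1 - \rho^2).
$$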
(*) For a proof, see, for example, Random Processes for Engineers by B. Hajek, freely available online, Sections 3.3 and 3.4.
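If it helps, here is a quick Monte Carlo sanity check of both formulas. It is a minimal sketch assuming example values for the mean and variances; the parameters below are arbitrary choices for illustration, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example parameters (arbitrary, for illustration only).
mu_x, var_x, var_n = 2.0, 4.0, 1.0
n = 1_000_000

x = rng.normal(mu_x, np.sqrt(var_x), n)      # X ~ N(mu_x, var_x)
noise = rng.normal(0.0, np.sqrt(var_n), n)   # N ~ N(0, var_n), independent of X
y = x + noise                                # Y = X + N

var_y = var_x + var_n
rho2 = var_x / var_y                         # rho^2 = var(X) / var(Y)

# MMSE estimator for jointly Gaussian (X, Y):
# E[X | Y] = E[X] + rho^2 * (Y - E[Y]), with E[Y] = E[X] since E[N] = 0.
x_hat = mu_x + rho2 * (y - mu_x)

empirical_mse = np.mean((x - x_hat) ** 2)
theoretical_mse = var_x * (1.0 - rho2)       # var(X) * (1 - rho^2)

print(f"empirical MSE:   {empirical_mse:.4f}")
print(f"theoretical MSE: {theoretical_mse:.4f}")
```

With these parameters $\rho^2 = 0.8$, so both printed values should come out near $0.8$.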