Product of marginal Gaussian and conditional Gaussian

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Suppose I have $$ \begin{align} p(x_1) &= \mathcal N(x_1; 0, 1) \\ p(x_2 \mid x_1) &= \mathcal N(x_2; x_1, 1) \end{align} $$ How do I compute $p(x_1 \mid x_2)$? I know how to compute their product, which gives me $\mathcal N\left(\frac{x_1}{2}, \frac{1}{2}\right)$. The answer given in the exercise is $\mathcal N\left(\frac{x_2}{2}, \frac{1}{2}\right)$, i.e., with $x_2$ rather than $x_1$ in the mean.
The following results hold for multivariate Gaussians and can be used for univariate Gaussians as well; I came across them in standard textbook treatments.
Result #1: If random variables $x \in \mathbb R^n$ and $y \in \mathbb R^m$ have the Gaussian distributions $$x \sim \mathcal N (\mu, \Sigma)$$ $$y \,|\, x \sim \mathcal N (A x + b, \Omega)$$ then the joint distribution of $x, y$ $$\begin{pmatrix} x \\ y \end{pmatrix} \sim \mathcal{N} \left( \begin{pmatrix} \mu \\ A \mu + b \end{pmatrix}, \begin{pmatrix} \Sigma & \Sigma A^{\top} \\ A \Sigma & A \Sigma A^{\top} + \Omega \end{pmatrix} \right)$$
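As a quick sanity check of result #1, you can simulate the generative process and compare the empirical joint covariance of $(x, y)$ against the block formula. Below is a minimal sketch in Python, assuming NumPy is available; the particular values of $\mu, \Sigma, A, b, \Omega$ are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters with x and y both in R^2 (any conformable shapes work).
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])
A = np.array([[1.0, 0.5], [0.0, 1.0]])
b = np.array([0.5, -0.5])
Omega = np.eye(2)

# Sample x ~ N(mu, Sigma), then y | x ~ N(Ax + b, Omega).
n = 200_000
x = rng.multivariate_normal(mu, Sigma, size=n)
y = x @ A.T + b + rng.multivariate_normal(np.zeros(2), Omega, size=n)

# The empirical covariance of the stacked vector (x, y) should approach
# [[Sigma, Sigma A^T], [A Sigma, A Sigma A^T + Omega]] from result #1.
empirical = np.cov(np.hstack([x, y]), rowvar=False)
theoretical = np.block([
    [Sigma, Sigma @ A.T],
    [A @ Sigma, A @ Sigma @ A.T + Omega],
])
print(np.round(empirical - theoretical, 2))  # entries close to 0
```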
You can use result #1 to find the marginal distribution of $x_2$. Then you can use the definition of conditional probability to find the desired pdf: $$p(x_1 \,|\, x_2) = \frac{p (x_1, x_2)}{p (x_2)} = \frac{p(x_2 \,|\, x_1) p(x_1)}{p(x_2)}.$$
Using result #1 above, with $\mu = 0, \Sigma = 1, A = 1, b = 0, \Omega = 1$, we have that $$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \sim \mathcal{N} \left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix} \right).$$ Therefore, $x_2 \sim \mathcal{N} (0, 2)$. Using a bit of algebra, you'll find that $$p(x_1 \,|\, x_2) = \frac{ \mathcal{N} \left( x_2; x_1, 1 \right) \cdot \mathcal{N} \left( x_1; 0, 1 \right) }{ \mathcal{N} \left( x_2; 0, 2\right) } = \mathcal{N} \left( x_1; \frac{x_2}{2}, \frac12 \right)$$ which implies that $$x_1 \,|\, x_2 \sim \mathcal{N} \left( \frac{x_2}{2}, \frac12 \right).$$
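The "bit of algebra" amounts to completing the square in $x_1$. In the exponent of the numerator, $$(x_2 - x_1)^2 + x_1^2 = 2 \left( x_1 - \frac{x_2}{2} \right)^2 + \frac{x_2^2}{2},$$ so, as a function of $x_1$, $$p(x_1 \,|\, x_2) \propto \exp \left( -\frac{1}{2} \left[ (x_2 - x_1)^2 + x_1^2 \right] \right) \propto \exp \left( - \left( x_1 - \frac{x_2}{2} \right)^2 \right),$$ which is the kernel of $\mathcal N \left( x_1; \frac{x_2}{2}, \frac12 \right)$. This also explains why the mean is $\frac{x_2}{2}$ rather than $\frac{x_1}{2}$: the density is over $x_1$, so the completed square is centered at a function of $x_2$.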
Below is another result that can be used, along with result #1, to answer your question without working directly with pdfs.
Result #2: If random variables $x \in \mathbb R^n$ and $y \in \mathbb R^m$ have the joint Gaussian distribution \begin{equation} \begin{pmatrix} x \\ y \end{pmatrix} \sim \mathcal N \left( \begin{pmatrix} a \\ b \end{pmatrix}, \begin{pmatrix} A & C \\ C^{\top} & B \end{pmatrix} \right), \end{equation} then the conditional distributions of $x$ and $y$ are given as $$x \,|\, y \sim \mathcal N \left( a + C B^{-1} (y - b), A - C B^{-1} C^{\top} \right)$$ $$y \,|\, x \sim \mathcal N \left( b + C^{\top} A^{-1} (x - a), B - C^{\top} A^{-1} C \right)$$ provided $A$ and $B$ are invertible.
From result #1, we know that the joint distribution of $x_1$ and $x_2$ is $$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \sim \mathcal{N} \left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix} \right).$$ Using result #2, with $a=0, b=0, A=1, B=2, C=1$, we see that the conditional distribution of $x_1$ given $x_2$ is $$x_1 \,|\, x_2 \sim \mathcal{N} \left( \frac{x_2}{2}, \frac12 \right),$$ matching the result obtained above.
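As a final sanity check of this conditional, one can again simulate: draw from the joint distribution above, keep the samples whose $x_2$ falls in a narrow window around a fixed value, and compare the conditional mean and variance of $x_1$ with $\frac{x_2}{2}$ and $\frac12$. A short sketch, again assuming NumPy (the conditioning value $1.5$ and the window width are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Joint distribution from result #1: mean (0, 0), covariance [[1, 1], [1, 2]].
cov = np.array([[1.0, 1.0], [1.0, 2.0]])
samples = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

# Crude conditioning: keep draws whose x2 lies in a narrow window around 1.5.
x2_value = 1.5
kept = samples[np.abs(samples[:, 1] - x2_value) < 0.01, 0]

print(kept.mean())  # ~ x2_value / 2 = 0.75
print(kept.var())   # ~ 0.5
```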