Relationship between conditional expectation and marginal density?

I have seen this equation in a book (Statistical Rethinking): $$ \Pr(w) = E(\Pr(w\mid p)) = \int \Pr(w\mid p)\Pr(p)\,dp $$ where $\Pr(w, p)$ is the joint probability density function. Can somebody explain to me the equality between the expected value of this conditional probability and the marginal density?
Let $W,P$ be random variables with joint PDF $f_{W,P}\left(w,p\right)$.
The marginal PDFs can be found as: $$f_{W}\left(w\right)=\int f_{W,P}\left(w,p\right)\,dp\qquad\text{and}\qquad f_{P}\left(p\right)=\int f_{W,P}\left(w,p\right)\,dw$$
For a fixed $p$ with $f_{P}\left(p\right)>0$ we can define the conditional PDF: $$f_{W}\left(w\mid p\right)=\frac{f_{W,P}\left(w,p\right)}{f_{P}\left(p\right)}$$
So we have the equality: $$f_{W,P}\left(w,p\right)=f_{W}\left(w\mid p\right)f_{P}\left(p\right)$$ Integrating both sides over $p$, we find: $$f_{W}\left(w\right)=\int f_{W,P}\left(w,p\right)\,dp=\int f_{W}\left(w\mid p\right)f_{P}\left(p\right)\,dp$$
Here $\int f_{W}\left(w\mid p\right)f_{P}\left(p\right)\,dp$ can be recognized as the expectation $\mathbb{E}\left[f_{W}\left(w\mid P\right)\right]$, taken over the random variable $P$. This justifies: $$f_{W}\left(w\right)=\mathbb{E}\left[f_{W}\left(w\mid P\right)\right]=\int f_{W}\left(w\mid p\right)f_{P}\left(p\right)\,dp$$
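To see this identity numerically, here is a minimal sketch in Python, assuming (for illustration only) the globe-tossing setup used in Statistical Rethinking: $W\mid p\sim\operatorname{Binomial}(n,p)$ with a $\operatorname{Uniform}(0,1)$ prior on $p$. The numbers $n=9$, $w=6$ are hypothetical:

```python
from scipy.integrate import quad
from scipy.stats import binom, uniform

# Hypothetical data: n tosses of the globe, w "water" observations.
n, w = 9, 6

# f_W(w) = integral of f_W(w | p) f_P(p) dp, evaluated by numerical quadrature:
# binom.pmf(w, n, p) plays f_W(w | p), uniform.pdf(p) plays f_P(p).
marginal, _abs_err = quad(lambda p: binom.pmf(w, n, p) * uniform.pdf(p), 0.0, 1.0)

print(marginal)  # ≈ 0.1
```

With the uniform prior this integral has the closed form $\int_{0}^{1}\binom{n}{w}p^{w}(1-p)^{n-w}\,dp=\frac{1}{n+1}$, so the quadrature result of $0.1$ is exact up to numerical error.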
So in the notation used in your question: $$\Pr(w)=\mathbb E\left[\Pr(w\mid P)\right]=\int\Pr(w\mid p)\Pr(p)\,dp$$
The only difference is that I keep a capital $P$ inside the expectation, to avoid confusion between the random variable $P$ and the values $p$ that this variable can take. Neglecting this distinction quite often leads to confusion.
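That distinction is also what a Monte Carlo reading of the identity makes explicit: draw realizations of the random variable $P$ from its distribution and average the conditional probability $\Pr(w\mid P)$. A minimal sketch, again assuming the illustrative binomial/uniform setup from above:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
n, w = 9, 6  # same hypothetical globe-tossing numbers as above

# Draw many realizations of the random variable P ~ Uniform(0, 1) ...
p_samples = rng.uniform(0.0, 1.0, size=1_000_000)

# ... and average Pr(w | P): by the law of large numbers the sample mean
# converges to E[Pr(w | P)], which the derivation above shows equals Pr(w).
mc_estimate = binom.pmf(w, n, p_samples).mean()

print(mc_estimate)  # ≈ 0.1, matching the exact marginal 1/(n + 1)
```

The integral form and the Monte Carlo average are two views of the same expectation; the capital $P$ marks exactly the quantity being sampled.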