Posterior distribution of binomial likelihood and mixture of beta and uniform prior

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Let $X=(X_1,X_2,\dots,X_n)$ be a random sample from the binomial distribution such that $X_i\mid\theta\sim \mathrm{Bin}(m,\theta)$ with $m$ known. If $\theta\sim p\cdot\mathrm{Unif}(0,1)+(1-p)\cdot\mathrm{Beta}(a,b)$ with $a,b,p$ known, I'd like to find the posterior $\theta\mid X$. I know that I have to use Bayes' theorem, but I don't know how to deal with the fact that the distribution of $\theta$ is a mixture.

There is 1 best solution below.
Let $\bar{X}$ be the mean of the sample.
The posterior density is proportional to the prior times the likelihood:
$$\left( p+(1-p)\,\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\,\theta^{a-1}(1-\theta)^{b-1} \right)\prod_{i=1}^{n}\theta^{X_i} (1-\theta)^{m-X_i}$$
$$=p\,\theta^{n\bar{X}}(1-\theta)^{n(m-\bar{X})}+(1-p)\,\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\,\theta^{a+n\bar{X}-1}(1-\theta)^{b+n(m-\bar{X})-1}.$$
Multiplying and dividing each term by the normalizing constant of the corresponding Beta density, and using $n\bar{X}+n(m-\bar{X})=nm$, this equals
$$p\,\frac{\Gamma(n\bar{X}+1)\,\Gamma(n(m-\bar{X})+1)}{\Gamma(nm+2)}\cdot\frac{\Gamma(nm+2)}{\Gamma(n\bar{X}+1)\,\Gamma(n(m-\bar{X})+1)}\,\theta^{n\bar{X}}(1-\theta)^{n(m-\bar{X})}$$
$$+\,(1-p)\,\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\cdot\frac{\Gamma(n\bar{X}+a)\,\Gamma(n(m-\bar{X})+b)}{\Gamma(nm+a+b)}\cdot\frac{\Gamma(nm+a+b)}{\Gamma(n\bar{X}+a)\,\Gamma(n(m-\bar{X})+b)}\,\theta^{n\bar{X}+a-1}(1-\theta)^{n(m-\bar{X})+b-1}.$$
This is a mixture of two Beta distributions: the first is $\mathrm{Beta}(n\bar{X}+1,\,n(m-\bar{X})+1)$ and the second is $\mathrm{Beta}(n\bar{X}+a,\,n(m-\bar{X})+b)$.
The mixture probabilities (weights) are proportional to $$p \frac{\Gamma(n\bar{X}+1)\Gamma(n(m-\bar{X})+1)}{\Gamma(n\bar{X}+1+n(m-\bar{X})+1)} =p \frac{\Gamma(n\bar{X}+1)\Gamma(n(m-\bar{X})+1)}{\Gamma(nm+2)}$$ and $$(1-p) \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \frac{\Gamma(n\bar{X}+a)\Gamma(n(m-\bar{X})+b)}{\Gamma(n\bar{X}+a+n(m-\bar{X})+b)} =(1-p) \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \frac{\Gamma(n\bar{X}+a)\Gamma(n(m-\bar{X})+b)}{\Gamma(nm+a+b)}$$
These unnormalized weights must be rescaled so that they sum to $1$: divide each by their sum to obtain the normalized mixture weights.
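To make this concrete, here is a minimal Python sketch (NumPy/SciPy, with made-up values for $a,b,p,m,n$ and the success total $s=n\bar{X}$, none of which come from the original post). It computes the normalized weights on the log scale, since the Gamma factors overflow double precision once $nm$ is large, and checks the resulting Beta mixture against direct numerical integration of the unnormalized posterior.

```python
import numpy as np
from scipy.special import betaln
from scipy.integrate import quad
from scipy.stats import beta as beta_dist

# Hypothetical example values (not from the original post):
a, b, p = 2.0, 3.0, 0.4   # prior hyperparameters
m, n = 10, 25             # binomial trials per observation, sample size
s = 140                   # observed total of successes, s = n * Xbar

# Posterior mixture components:
#   Beta(s + 1, n*m - s + 1)  from the Unif(0,1) part of the prior
#   Beta(s + a, n*m - s + b)  from the Beta(a, b) part of the prior
a1, b1 = s + 1, n * m - s + 1
a2, b2 = s + a, n * m - s + b

# Unnormalized log-weights, using B(x, y) = Gamma(x)Gamma(y)/Gamma(x+y):
#   w1 = p * B(s+1, nm-s+1)
#   w2 = (1-p) * B(s+a, nm-s+b) / B(a, b)
log_w1 = np.log(p) + betaln(a1, b1)
log_w2 = np.log(1 - p) + betaln(a2, b2) - betaln(a, b)

# Normalize on the log scale, then exponentiate.
log_norm = np.logaddexp(log_w1, log_w2)
w1, w2 = np.exp(log_w1 - log_norm), np.exp(log_w2 - log_norm)
print(f"posterior weights: {w1:.6f}, {w2:.6f}")  # they sum to 1

# Sanity check: the mean of the weighted Beta mixture should match
# numerical integration of theta against the unnormalized posterior.
def unnorm_post(t):
    prior = p + (1 - p) * beta_dist.pdf(t, a, b)
    return prior * t**s * (1 - t) ** (n * m - s)

Z, _ = quad(unnorm_post, 0, 1)
mean_quad, _ = quad(lambda t: t * unnorm_post(t), 0, 1)
mean_mix = w1 * a1 / (a1 + b1) + w2 * a2 / (a2 + b2)
print(f"posterior mean: mixture={mean_mix:.6f}, quadrature={mean_quad / Z:.6f}")
```

Both printed posterior means should agree to several decimal places; for much larger $nm$ the quadrature check eventually underflows, while the log-scale weights remain stable.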