So suppose I have a coin with probability $\mu$ of landing heads and probability $1-\mu$ of landing tails.
I am given the prior distribution $\mu \sim \mathrm{Uniform}[0,1]$, and my observed data are $D_1 = \{H, T\}$, i.e., one head and one tail.
I was wondering how I could find the posterior distribution $p(\mu \mid D_1)$. I was thinking of applying Bayes' rule, which gives $p(\mu \mid D_1) = \frac{p(D_1 \mid \mu)\, p(\mu)}{p(D_1)}$, but I have no idea how to find $p(D_1)$. After some calculation I know that $p(H) = 1/2$, so I thought $p(D_1 \mid \mu) = 1/2 \cdot 1/2 = 1/4$. Is this correct?
I think I figured it out. Essentially, using the law of total probability I can expand $p(D_1)$ as an integral over $\mu$, and since $\mu$ is uniformly distributed on $[0,1]$, $p(\mu) = 1$. In addition, $p(D_1 \mid \mu) = \mu(1-\mu)$, since $\mu$ is the probability of heads and $1-\mu$ the probability of tails.
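Writing that expansion out explicitly (taking $D_1$ to be exactly one head and one tail, as above):

$$p(D_1) = \int_0^1 p(D_1 \mid \mu)\, p(\mu)\, d\mu = \int_0^1 \mu(1-\mu) \cdot 1 \, d\mu = \frac{1}{2} - \frac{1}{3} = \frac{1}{6}.$$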
Using those facts, I can solve for $p(\mu \mid D_1)$. The idea I was missing was that $\mu$ is a random variable rather than a fixed value, since this is Bayesian inference.
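Putting the pieces together, the posterior is

$$p(\mu \mid D_1) = \frac{p(D_1 \mid \mu)\, p(\mu)}{p(D_1)} = \frac{\mu(1-\mu)}{1/6} = 6\,\mu(1-\mu),$$

which is a $\mathrm{Beta}(2,2)$ density. This matches the standard conjugate update: the uniform prior is $\mathrm{Beta}(1,1)$, and observing one head and one tail adds one to each shape parameter.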