I am having trouble with Bayesian statistics and would like to understand it a little better.
Could you tell me whether the following is true?
Suppose we have observations $X_1, \dots, X_n \sim_{iid} f(x;\theta)$, and the prior on the parameter $\theta$ is a random variable $\Theta \sim \pi(\theta)$.
I know that by definition the posterior distribution of $\theta$ is
$$k(\theta|x_1,...,x_n)= \frac{f(\theta|x_1,...,x_n)\pi(\theta)}{\int_{-\infty}^{\infty}f(\theta|x_1,...,x_n)\pi(\theta)d\theta}$$ (in the continuous case).
Since the $X_i$ are iid, I want to say that the likelihood function can be substituted in place of $f(\theta|x_1,...,x_n)$, like so:
$$k(\theta|x_1,...,x_n)=\frac{ L(\theta)\pi(\theta)}{\int_{-\infty}^{\infty}L(\theta)\pi(\theta)d\theta}$$
Is this an acceptable notation or is it incorrect?
There aren't many examples in my notes that I am able to follow, and I would really like to overcome my fear of Bayesian statistics.
I appreciate your help.
You say:
$$k(\theta|x_1,...,x_n)= \frac{f(\theta|x_1,...,x_n)\pi(\theta)}{\int_{-\infty}^{\infty}f(\theta|x_1,...,x_n)\pi(\theta)d\theta}$$
But this is not the definition. In actual fact it is:
$$k(\theta|x_1,...,x_n)= \frac{f(x_1,...,x_n|\theta)\pi(\theta)}{\int_{-\infty}^{\infty}f(x_1,...,x_n|\theta)\pi(\theta)d\theta}$$
where $f(x_1,...,x_n|\theta)$ is the likelihood of obtaining the data given the parameter value $\theta$. Writing it as $f(x_1,...,x_n|\theta)$ or as $L(x_1,...,x_n|\theta)$ is really saying the same thing; it is just different notation. When computing maximum likelihood estimates, we maximise the likelihood function over the parameter $\theta$, and in that context we usually abbreviate $L(x_1,...,x_n|\theta)$ to $L(\theta)$. Again, just notation.
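To see the corrected formula in action numerically, here is a small sketch. The data and prior are my own invented example (not from your question): 10 Bernoulli coin flips with unknown head probability $\theta$ and a $\mathrm{Beta}(2,2)$ prior, with the integral in the denominator approximated by a Riemann sum over a grid of $\theta$ values.

```python
import numpy as np

# Hypothetical example: iid Bernoulli(theta) data with a Beta(2, 2) prior.
# We evaluate k(theta | x) = L(theta) * pi(theta) / integral L(theta) * pi(theta) dtheta
# on a grid, approximating the integral by a Riemann sum.

data = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])  # 7 heads, 3 tails

theta = np.linspace(0.0005, 0.9995, 1000)  # grid over the parameter space
dtheta = theta[1] - theta[0]               # grid spacing for the integral

# Likelihood L(theta) = prod_i f(x_i | theta) for iid Bernoulli data
heads = data.sum()
tails = len(data) - heads
L = theta**heads * (1 - theta)**tails

# Beta(2, 2) prior density pi(theta), up to its normalising constant
# (the constant cancels between numerator and denominator)
prior = theta * (1 - theta)

# Posterior: numerator divided by the approximated integral
unnorm = L * prior
posterior = unnorm / (unnorm.sum() * dtheta)

post_mean = (theta * posterior).sum() * dtheta
```

By conjugacy, the exact posterior here is $\mathrm{Beta}(2+7,\,2+3)$, so the grid estimate of the posterior mean should land close to $9/14$, which is a handy check that the normalisation was done correctly.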