I am learning statistics on my own. The question might be trivial, but it is not a homework problem.
Let $n$ be a positive integer. Suppose $X_1, X_2, \cdots, X_n$ are IID Bernoulli random variables $B(\theta)$ with an unknown $\theta$. Let $X = \frac{X_1 + \cdots + X_n}{n}$, and suppose the observed value of $X$ is $\frac{k}{n}$ for some integer $0 \le k \le n$. Then
$$P(X=\frac{k}{n}\,|\,\theta) = {n \choose k} \theta^k (1-\theta)^{n-k} \, [k \in \{0 \cdots n\}]. $$
Fix $a \in [0,1]$. I'm interested in calculating how likely it is that the underlying parameter $\theta$ equals $a$. I hope to do it via Bayes' formula:
$$ P(\theta = a \mid X = \tfrac{k}{n}) = \frac{P(X = \tfrac{k}{n} \mid \theta = a)\, P(\theta = a)}{P(X = \tfrac{k}{n})}. $$
However, I cannot think of what $P(X = \tfrac{k}{n})$ and $P(\theta = a)$ are. Am I missing anything?
Your model is Bernoulli. Thus the likelihood you observe will be
$$p(\mathbf{x}|\theta)\propto \theta^k(1-\theta)^{n-k}$$
The posterior distribution is
$$\pi(\theta|\mathbf{x}) \propto\pi(\theta) p(\mathbf{x}|\theta)$$
To derive the posterior you now have several choices for the prior:

1. With a flat prior $\pi(\theta) \propto 1$ on $[0,1]$, the posterior is
$$\pi(\theta|\mathbf{x}) \propto \theta^k(1-\theta)^{n-k},$$
i.e. $\theta|\mathbf{x} \sim \text{Beta}(k+1,\, n-k+1)$.
That is
$$\pi(\theta|\mathbf{x})=\frac{\Gamma(n+2)}{\Gamma(k+1)\Gamma(n-k+1)}\theta^k(1-\theta)^{n-k}$$
for $\theta \in [0,1]$.
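As a quick sketch (with hypothetical data: $k=7$ successes in $n=10$ trials), the flat-prior posterior $\text{Beta}(k+1, n-k+1)$ can be evaluated directly with `scipy.stats.beta`:

```python
from scipy.stats import beta

n, k = 10, 7  # hypothetical: 7 successes in 10 trials

# Flat prior pi(theta) = 1 on [0,1]  ->  posterior Beta(k+1, n-k+1)
posterior = beta(k + 1, n - k + 1)

print(posterior.mean())          # posterior mean (k+1)/(n+2)
print(posterior.interval(0.95))  # central 95% credible interval for theta
```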
2. If you have some information about the values of the parameter, i.e. it is more probable that $\theta \in A \subset [0,1]$, you can use a conjugate Beta prior encoding that information and do the same calculation as in 1.
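For a conjugate $\text{Beta}(a, b)$ prior, the posterior is $\text{Beta}(a + k,\, b + n - k)$; a minimal sketch, with hypothetical prior hyperparameters and data:

```python
from scipy.stats import beta

n, k = 10, 7     # hypothetical data: 7 successes in 10 trials
a, b = 5.0, 2.0  # hypothetical Beta prior favoring larger theta

# Conjugacy: Beta(a, b) prior x Bernoulli likelihood
# -> Beta(a + k, b + n - k) posterior
posterior = beta(a + k, b + n - k)
print(posterior.mean())  # posterior mean (a + k) / (a + b + n)
```

The flat prior of case 1 is recovered as the special case $a = b = 1$.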
3. You can use a noninformative prior (EDIT: not exactly what Jeffreys proposed), which in this case takes the form
$$\pi(\theta)\propto \frac{1}{\theta(1-\theta)}$$
and your posterior becomes
$$\pi(\theta|\mathbf{x})\propto\theta^{k-1}(1-\theta)^{n-k-1}$$
which is again a Beta, with different parameters:
$$\theta|\mathbf{x} \sim \text{Beta}(k,\, n-k).$$
In this case, your posterior is a proper density only if $0<k<n$, because in the extreme situations of no successes or all successes the posterior is not integrable.
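A small sketch of this last point: the $\text{Beta}(k, n-k)$ posterior is a proper density only when both shape parameters are positive, i.e. when $0 < k < n$:

```python
from scipy.stats import beta

n = 10

# Prior pi(theta) ∝ 1/(theta(1-theta)) gives posterior Beta(k, n-k),
# which is proper only when both shape parameters are positive.
for k in (0, 5, 10):
    proper = (k > 0) and (n - k > 0)
    print(k, proper)
    if proper:
        print(beta(k, n - k).mean())  # posterior mean k/n
```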