I am trying something very simple to get more "inside" Bayesian statistics, but the results are a bit odd.
Let's say I have an infinite bowl, inside of which are black and white balls. We do not know the ratio of black to white, but we assume that the probability of black is $1/2$ and I have as the prior distribution the uniform distribution between $0$ and $1$, i.e. $P(\theta)=1, \ \forall \theta$.
Let's assume that I pick one ball and it is black.
Now I wish to get the posterior: $P(\theta|x)=\frac{P(x|\theta) P(\theta)}{\int_{\theta}P(x|\theta)P(\theta)\, d\theta}.$ What is the probability of one black ball if the probability of black is $1/2$? It is $1/2$. The prior was $1$. In the integral, the prior is $1$ and the probability of black equals the parameter, so it is $\int_0^1 \theta\, d\theta = 1/2$. So $P(\theta|x)=\frac{1/2 \cdot 1}{1/2}=1.$
So if I got this right, getting a single black didn't change my belief about the distribution at all.
What if I get two blacks?
Now $P(x|\theta) = 1/4$, $P(\theta) = 1$, and the integral is $\int_0^1 \theta^2\, d\theta = 1/3$, so $P(\theta|x) = \frac{1/4}{1/3} = 3/4.$ What does this mean? How can that be a distribution? Is it the probability that the probability of there being half blacks is $3/4$? Or does it mean that the probability of black is $3/4$?
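A quick numerical sanity check of the two calculations above (a sketch using only the standard library): evaluate the posterior density $P(\theta|x) = P(x|\theta)/\int_0^1 P(x|t)\,dt$ at $\theta = 1/2$, for one and for two black balls, with the uniform prior so $P(\theta)=1$.

```python
# Numerical check of the posterior density at theta = 1/2.
# With a uniform prior, the likelihood of k blacks in k draws is theta^k,
# and the denominator is the integral of theta^k over [0, 1].
m = 100_000  # midpoint-rule grid size

def posterior_at_half(k):
    # Normalizing constant: integral of theta^k dt, approximated by a midpoint sum
    Z = sum(((i + 0.5) / m) ** k for i in range(m)) / m
    return 0.5 ** k / Z

print(round(posterior_at_half(1), 4))  # 1.0   -> same as the prior density at 1/2
print(round(posterior_at_half(2), 4))  # 0.75  -> (1/4) / (1/3) = 3/4
```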
First things first: the statement "we assume that the probability of black is 1/2 and I have as the prior distribution the uniform distribution between 0 and 1" is self-contradictory. Either you assume a uniform prior, or you assume certainty about the parameter. I'll take it that you assume a uniform prior. The proper Bayesian model for your situation is the "beta-binomial" model, since the uniform is a special case of the Beta distribution:
$$ \theta \sim \operatorname{Beta}(\alpha,\beta) $$
$$ X|\theta \sim \operatorname{Bin}(n,\theta) $$
In this model, $\theta$ is the proportion of black balls in the bowl, $\alpha$ and $\beta$ are hyperparameters (set both to $1$ to get your uniform prior), and $X$ is the number of black balls in a sample of size $n$ randomly drawn from your bowl.
Due to conjugacy, the posterior (i.e., the distribution of $\theta|x$) is $\operatorname{Beta}(\alpha+x,\beta+n-x)$. You can check this simply by multiplying the beta density and the binomial probability function. (And if you do that smartly, you'll see you don't need to evaluate the integral that appears in your denominator.)
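You can verify the conjugacy claim numerically as well. The sketch below (standard library only; the sample numbers $n=10$, $x=7$ are made up for illustration) multiplies the binomial likelihood by the beta prior, normalizes by the integral from the denominator of Bayes' rule, and compares the result with the closed-form $\operatorname{Beta}(\alpha+x,\beta+n-x)$ density:

```python
import math

# Made-up example data: n = 10 draws, x = 7 black, uniform prior (alpha = beta = 1)
alpha, beta, n, x = 1.0, 1.0, 10, 7

def beta_pdf(t, a, b):
    # Beta density: t^(a-1) (1-t)^(b-1) / B(a, b)
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return t ** (a - 1) * (1 - t) ** (b - 1) / B

def binom_pmf(k, n, t):
    return math.comb(n, k) * t ** k * (1 - t) ** (n - k)

def unnorm(t):
    # Unnormalized posterior: likelihood * prior
    return binom_pmf(x, n, t) * beta_pdf(t, alpha, beta)

# Normalizing constant: midpoint-rule approximation of the integral over [0, 1]
m = 100_000
Z = sum(unnorm((i + 0.5) / m) for i in range(m)) / m

# Posterior density at a test point vs. the conjugate answer Beta(alpha+x, beta+n-x)
t0 = 0.6
numeric = unnorm(t0) / Z
closed_form = beta_pdf(t0, alpha + x, beta + n - x)
print(abs(numeric - closed_form) < 1e-6)  # True: the two densities agree
```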
Whatever the data are, the posterior will always differ from the prior, even with a single black ball ($X=1$) from a sample of size $n=1$. In that case, using the uniform prior, the posterior is $\operatorname{Beta}(2,1)$, whose mean is $2/3$, not $1/2$.
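Concretely, $\operatorname{Beta}(2,1)$ has density $p(\theta|x) = 2\theta$ on $[0,1]$. The value $1$ you computed at $\theta = 1/2$ is this *density* evaluated at one point, not a probability. A small check (a sketch, standard library only) that this density integrates to $1$ and has mean $2/3$:

```python
# Posterior after one black ball (x = 1, n = 1) with a uniform prior:
# Beta(2, 1), i.e. density p(theta | x) = 2 * theta on [0, 1].
m = 100_000
grid = [(i + 0.5) / m for i in range(m)]   # midpoint-rule grid
density = [2 * t for t in grid]

total = sum(density) / m                                 # should be 1
mean = sum(t * d for t, d in zip(grid, density)) / m     # should be 2/3

print(round(total, 6), round(mean, 6))  # 1.0 0.666667
```

Note that the density at $\theta = 1/2$ is $2 \cdot 1/2 = 1$, which is exactly the number in the question: a density value, not a probability.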
Your statement "the probability of black is the same as the parameter" and several others simply do not make sense: $P(\theta|x)$ is a density over the parameter, not a probability of drawing black. I hope this micro-introduction helps you get started.