You have a coin and your prior assumption is that its probability of heads $\theta$ is chosen from a uniform distribution on $[0, 1]$. You toss the coin 10 times and get 6 heads. What is the estimate of $\theta$?
I figured it has to be $\frac{6}{10}$, but is there a theorem or rule that can back up my guess?
You stated "Bayes" in the title of your question; therefore, the posterior estimate of $\theta$ is not a single value, but a distribution.
With a binomial likelihood, the beta distribution is a conjugate prior. That is to say, if $$\theta \sim \operatorname{Beta}(a,b),$$ and $$X \mid \theta \sim \operatorname{Binomial}(n, \theta),$$ then the posterior is $$\theta \mid X \sim \operatorname{Beta}(a+X, \; b+n-X).$$ In your case, a uniform prior corresponds to the hyperparameters $a = b = 1$, and we observed $X = 6$ heads in $n = 10$ tosses. Hence the posterior for $\theta$ is beta distributed with posterior hyperparameters $a^* = 1+6 = 7$ and $b^* = 1+10-6 = 5$, and has density $$f_{\theta \mid X}(\theta) = 2310 \, \theta^6 (1-\theta)^4 \, \mathbb 1 (0 < \theta < 1).$$
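As a quick sanity check, the conjugate update and the normalizing constant $1/B(a^*, b^*) = 2310$ can be verified numerically; this is just a sketch using the standard-library gamma function, with the variable names (`a_post`, `b_post`, `norm`) chosen here for illustration:

```python
from math import gamma

# Uniform prior Beta(1, 1); data: X = 6 heads in n = 10 tosses.
a, b, n, x = 1, 1, 10, 6

# Conjugate update: posterior is Beta(a + X, b + n - X).
a_post, b_post = a + x, b + n - x          # Beta(7, 5)

# Normalizing constant 1/B(a*, b*) = Gamma(a* + b*) / (Gamma(a*) Gamma(b*)).
norm = gamma(a_post + b_post) / (gamma(a_post) * gamma(b_post))

print(a_post, b_post, norm)                # 7 5 2310.0
```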
The mode of this posterior occurs at $\hat \theta = 3/5 = 0.6$, which is easily found by differentiation. This is also the frequentist maximum likelihood estimator (MLE). However, it is by no means the only meaningful point estimate that can be constructed from the posterior density; e.g., one could consider the expectation, which would be $$\operatorname{E}[\theta \mid X] = \frac{a^*}{a^* + b^*} = \frac{7}{12}.$$ Since you do not specify what type of point estimate you wish to construct, or even whether you want a point estimate at all (you could be intending to construct an interval estimate), it is perhaps best that, in the Bayesian context of the question, we stop at the computation of the posterior density.
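The two point estimates above follow directly from the standard closed forms for the beta distribution, mode $(a^*-1)/(a^*+b^*-2)$ and mean $a^*/(a^*+b^*)$; a small sketch using exact rational arithmetic (variable names are illustrative):

```python
from fractions import Fraction

a_post, b_post = 7, 5                               # posterior Beta(7, 5)

# Posterior mode (the MAP estimate; equals the MLE under a uniform prior).
mode = Fraction(a_post - 1, a_post + b_post - 2)    # (7-1)/(7+5-2) = 3/5

# Posterior mean.
mean = Fraction(a_post, a_post + b_post)            # 7/12

print(mode, mean)                                   # 3/5 7/12
```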