$\theta \sim \text{ Uniform}(0,1)$ and $X|\theta \sim \text{ Bernoulli}(\theta)$.
How would I find the posterior of $\theta$?
The likelihood of a Bernoulli sample is $p^{\sum x_i} (1-p)^{n-\sum x_i}$. How do I proceed? It is supposed to be a Beta distribution, but I don't see how, since the exponents are not of the form $\alpha - 1$ and $\beta - 1$.
Probably you are given observations $X = \{X_1, \ldots, X_n\}$ with $X_i|\Theta = \theta \sim \text{ Bernoulli}(\theta)$, which gives the likelihood function of $\Theta$ as follows: $$\ell(x|\Theta = \theta) = \theta^{\sum_{i = 1}^n x_i}(1 - \theta)^{n - \sum_{i = 1}^n x_i} = \theta^t(1 - \theta)^{n - t},$$ where $t = \sum_{i = 1}^n x_i$.
The prior of $\Theta$, by assumption, has pdf $$\pi_\Theta(\theta) = I_{(0, 1)}(\theta).$$ Therefore the joint density of $X$ and $\Theta$ is $$p(x, \theta) = \ell(x|\theta)\pi_\Theta(\theta) = \theta^t(1 - \theta)^{n - t}I_{(0, 1)}(\theta).$$ By Bayes' theorem, the posterior distribution of $\Theta$ given $X = x$ is $$p_{\Theta|X}(\theta|x) = \frac{p(x, \theta)}{\int p(x, \theta)\, d\theta} \propto p(x, \theta), \tag{1}$$ where $\propto$ means the two expressions differ only by a constant that does not depend on $\theta$.
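In fact, the normalizing integral in $(1)$ can be evaluated in closed form as a Beta function:

$$\int_0^1 \theta^t(1 - \theta)^{n - t}\, d\theta = B(t + 1,\, n - t + 1) = \frac{t!\,(n - t)!}{(n + 1)!},$$

so dividing the kernel $\theta^t(1 - \theta)^{n - t}$ by this constant yields exactly the $\text{Beta}(t + 1, n - t + 1)$ pdf.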
As a function of $\theta$, $$p(x, \theta) = \theta^{(t + 1) - 1}(1 - \theta)^{(n - t + 1) - 1}I_{(0, 1)}(\theta)$$ is precisely the kernel of the $\text{Beta}(t + 1, n - t + 1)$ density — the exponents match with $\alpha - 1 = t$ and $\beta - 1 = n - t$, which is where the "$-1$" you were looking for comes from. Thus by $(1)$, we conclude that $$\Theta|X = x \sim \text{ Beta}\left(\sum x_i + 1,\; n - \sum x_i + 1\right).$$
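If you want to convince yourself numerically, here is a quick sketch (assuming NumPy and SciPy are available, and picking arbitrary values $n = 10$, $t = 4$ for illustration): normalize the kernel $\theta^t(1-\theta)^{n-t}$ on a grid and compare it against the $\text{Beta}(t+1, n-t+1)$ pdf.

```python
import numpy as np
from scipy.stats import beta

# Hypothetical data summary: n observations with t successes
n, t = 10, 4

# Unnormalized posterior kernel theta^t (1 - theta)^(n - t) on a grid over (0, 1)
theta = np.linspace(0.001, 0.999, 999)
kernel = theta**t * (1 - theta)**(n - t)

# Normalize numerically with the trapezoidal rule
post = kernel / np.trapz(kernel, theta)

# The numerically normalized kernel coincides with the Beta(t + 1, n - t + 1) density
print(np.allclose(post, beta.pdf(theta, t + 1, n - t + 1), atol=1e-3))
```

The printed value is `True`: the posterior is indeed $\text{Beta}(t + 1, n - t + 1)$, up to numerical integration error.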