I'm trying to sample $\beta_0$ and $\beta_1$ from a set of data modelled by a Bernoulli distribution.
\begin{equation}
p(y)=\theta^{y}(1-\theta)^{1-y}
\end{equation}
where the relationship between $y$ and my predictor $x_i$ is modelled by a logistic regression:
\begin{equation}
\nu = \log\left(\frac{\theta}{1-\theta}\right) = \beta_0 + \beta_1 x_i
\end{equation}
And my prior is:
\begin{equation}
\pi(\beta_0,\beta_1) \propto 1
\end{equation}
I've found my posterior $\pi(\beta_0,\beta_1|y)$:
\begin{equation}
\pi(\beta_0,\beta_1|y) \propto f(y|\beta_0,\beta_1)\,\pi(\beta_0,\beta_1) = \prod_{i=1}^{n}\left(\frac{e^{\beta_0+\beta_1x_i}}{1+e^{\beta_0+\beta_1x_i}}\right)^{y_i}\left(\frac{1}{1+e^{\beta_0+\beta_1x_i}}\right)^{1-y_i}
\end{equation}
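For concreteness, a Metropolis-type sampler only ever needs this posterior up to the normalizing constant, so it is enough to be able to evaluate the log of the expression above. A minimal sketch in Python with NumPy, where `x`, `y`, and the "true" coefficients are purely illustrative made-up data, not part of the question:

```python
import numpy as np

def log_posterior(beta0, beta1, x, y):
    """Unnormalized log-posterior under the flat prior pi(beta0, beta1) ∝ 1.

    With a flat prior this reduces to the Bernoulli log-likelihood of the
    logistic regression model.
    """
    nu = beta0 + beta1 * x                      # linear predictor
    # log(theta_i) = nu_i - log(1 + e^{nu_i});  log(1 - theta_i) = -log(1 + e^{nu_i})
    return np.sum(y * nu - np.log1p(np.exp(nu)))

# Hypothetical example data with true beta0 = 0.5, beta1 = 1.2
rng = np.random.default_rng(0)
x = rng.normal(size=100)
theta = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * x)))
y = rng.binomial(1, theta)

print(log_posterior(0.5, 1.2, x, y))
```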
So first I'm trying to sample $\beta_0$ and $\beta_1$ separately, so sampling from $\pi(\beta_0|y,\beta_1)$ then from $\pi(\beta_1|y,\beta_0)$. But since $\pi(\beta_0,\beta_1) \propto 1$, would those posteriors be the same thing and I'd be sampling from the same posterior twice? Or am I missing something? Would I just sub in $\nu$ instead of $\beta_0 + \beta_1x_i$ in the posterior written above and sample from that?
Any help would be greatly appreciated.
No, they are not the same thing. At iteration $n$, you first update $\beta_0$: draw a proposal $\beta_0^{(n*)}$ (e.g. from a normal distribution centred at $\beta_0^{(n-1)}$) targeting the full conditional
$$\pi(\beta_0|y,\beta_1^{(n-1)})$$
(if you could sample from this conditional exactly, it would be a pure Gibbs step and every draw would be accepted; for logistic regression you typically cannot, hence the accept/reject step). You then calculate the acceptance probability $\alpha$ and draw $U\sim \mathrm{Unif}(0,1)$. If $U<\alpha$, you set $\beta_0^{(n)}=\beta_0^{(n*)}$, the proposed value; otherwise $\beta_0^{(n)}=\beta_0^{(n-1)}$, i.e. the chain stays at the previous state.
Then, using this new value from the current iteration, you do the same for $\beta_1$: propose
$$\beta_1^{(n*)}\sim\pi(\beta_1|y,\beta_0^{(n)})$$
Apply the same accept/reject step, and that completes one iteration of the algorithm, giving $(\beta_0^{(n)}, \beta_1^{(n)})$. Notice that because the new value $\beta_0^{(n)}$ appears in the conditional for $\beta_1^{(n*)}$, the two updates are not draws from the same distribution.
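Putting this together, here is a minimal Metropolis-within-Gibbs sketch in Python. The random-walk normal proposals, step sizes, starting values, and synthetic data are all illustrative assumptions, not part of the original question:

```python
import numpy as np

def log_posterior(beta0, beta1, x, y):
    """Unnormalized log-posterior under the flat prior (Bernoulli likelihood)."""
    nu = beta0 + beta1 * x
    return np.sum(y * nu - np.log1p(np.exp(nu)))

def metropolis_within_gibbs(x, y, n_iter=5000, step0=0.5, step1=0.5, seed=1):
    rng = np.random.default_rng(seed)
    b0, b1 = 0.0, 0.0                                # arbitrary starting values
    samples = np.empty((n_iter, 2))
    for n in range(n_iter):
        # --- update beta0, conditioning on the current beta1 ---
        b0_star = b0 + step0 * rng.normal()          # random-walk proposal
        log_alpha = log_posterior(b0_star, b1, x, y) - log_posterior(b0, b1, x, y)
        if np.log(rng.uniform()) < log_alpha:
            b0 = b0_star                             # accept, else keep old value
        # --- update beta1, conditioning on the NEW beta0 ---
        b1_star = b1 + step1 * rng.normal()
        log_alpha = log_posterior(b0, b1_star, x, y) - log_posterior(b0, b1, x, y)
        if np.log(rng.uniform()) < log_alpha:
            b1 = b1_star
        samples[n] = (b0, b1)
    return samples

# Hypothetical data with true beta0 = 0.5, beta1 = 1.2
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * x))))

samples = metropolis_within_gibbs(x, y)
print(samples[2500:].mean(axis=0))   # posterior means after discarding burn-in
```

Because the proposals are symmetric, the Metropolis-Hastings ratio reduces to the ratio of unnormalized conditional posteriors, which is why the improper flat prior never needs to be evaluated explicitly.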