Likelihood: $f(x^T, n^T|\theta^T) = \prod_{i=1}^{30} \binom{n^T_i}{x^T_i}{\theta^T}^{x^T_i}{(1-\theta^T)}^{n^T_i-x^T_i}$
Prior: $ \log\left(\frac{\theta^T}{1-\theta^T}\right)\sim N(\mu_T,\sigma_T^2) $
I am required to write an MCMC script in R using a single-site random-walk Metropolis scheme based on the above.
My questions:
How can I get the joint posterior from the given distributions?
Is it possible to find the distribution of $\theta^T$ ?
One is given the likelihood, that is, the distribution of $(X,N)$ conditionally on $T$. The prior, that is, the distribution of $T$, is specified through the distribution of $R=\log(T/(1-T))$. The joint posterior asked for is the distribution of $(X,N,T)$.
Let $p$, $f$, $g$, and $h$ denote the densities of the joint posterior, the likelihood, $T$, and $R$, respectively. Given $f(x,n\mid \theta)$ and $h(r)$, one asks for $p(x,n,\theta)$ and $g(\theta)$ (more rigorously, given $f$ and $h$, one asks for $p$ and $g$). By definition, $$ p(x,n,\theta)=f(x,n\mid\theta)g(\theta), $$ hence it remains to compute $g$.

Recall that, again by definition, $g$ is characterized by the fact that, for every test function $u$, $$ E[u(T)]=\int u(\theta)g(\theta)\,\mathrm d\theta, $$ and that, still by definition, $h$ is characterized by the fact that, for every test function $v$, $$ E[v(R)]=\int v(r)h(r)\,\mathrm dr. $$

But $E[v(R)]=E[u(T)]$ with $u:\theta\mapsto v(\log(\theta/(1-\theta)))$, hence one looks for a density $g$ such that, for every test function $v$, $$ \int v(r)h(r)\,\mathrm dr=\int v(\log(\theta/(1-\theta)))\,g(\theta)\,\mathrm d\theta. $$ The change of variable $r=\log(\theta/(1-\theta))$, $\mathrm dr=\mathrm d\theta/(\theta(1-\theta))$, yields $$ g(\theta)=\frac{h(\log(\theta/(1-\theta)))}{\theta(1-\theta)}. $$ At this point, one might recognize a Jacobian and be reminded of the change of variable formula.
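For the scripting part of the question, here is a minimal random-walk Metropolis sketch in R. The data `x`, `n` and the hyperparameters `mu`, `sigma` are placeholders (none are given in the question), so replace them with the actual values. Note that since the chain moves on the scale of $r=\log(\theta/(1-\theta))$, whose prior is the given normal, the density $h$ is used directly and the Jacobian factor $1/(\theta(1-\theta))$ derived above is not needed; it would appear if the walk were performed on the $\theta$ scale instead.

```r
set.seed(1)
## Placeholder data and hyperparameters -- substitute the real ones.
n <- rep(20, 30)                 # hypothetical numbers of trials
x <- rbinom(30, n, 0.3)          # hypothetical numbers of successes
mu <- 0; sigma <- 1              # prior on r = log(theta/(1-theta))

## Log posterior of r, up to an additive constant:
## binomial log likelihood at theta = plogis(r) plus normal log prior on r.
log_post <- function(r) {
  theta <- plogis(r)             # inverse logit
  sum(dbinom(x, n, theta, log = TRUE)) + dnorm(r, mu, sigma, log = TRUE)
}

n_iter <- 10000
step   <- 0.3                    # proposal standard deviation (tune this)
r <- numeric(n_iter)
r[1] <- 0
for (t in 2:n_iter) {
  prop <- r[t - 1] + rnorm(1, 0, step)     # random-walk proposal
  ## Accept with probability min(1, p(prop)/p(current)), on the log scale.
  if (log(runif(1)) < log_post(prop) - log_post(r[t - 1])) {
    r[t] <- prop
  } else {
    r[t] <- r[t - 1]
  }
}
theta_draws <- plogis(r)         # map the chain back to the theta scale
```

After discarding burn-in, `theta_draws` approximates samples from the posterior of $\theta^T$; the step size `step` should be tuned (e.g. toward an acceptance rate of roughly 20-40%) and mixing checked with `acf(r)`.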