Bayesian Estimator and Markov Chains


This is Exercise 6.1.14 from Dembo's notes found here. At this point, we are just beginning a discussion of Markov chains. I have no prior experience with estimators and so I am a bit lost with this problem. The problem is as follows:

Let $\theta$ and $(U_k)_k$ be independent and uniformly distributed on $(0,1)$. Let $X_k = \operatorname{sgn}(\theta - U_k)$ and $S_n = \sum_{k=1}^n X_k$.

(a) Compute $\mathbb{P} [ X_{n+1} = 1 \mid X_1, \ldots, X_n]$.

(b) Show that $(S_n)_n$ is a Markov chain. Is it homogeneous?
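For concreteness, the process in the exercise can be simulated directly. A minimal Python sketch (the function name `sample_path` is my own, not from the notes):

```python
import random

def sample_path(n, seed=0):
    """Sample one realization of (S_1, ..., S_n) from the exercise's setup."""
    rng = random.Random(seed)
    theta = rng.random()                # theta ~ Uniform(0,1)
    s, path = 0, []
    for _ in range(n):
        u = rng.random()                # U_k ~ Uniform(0,1), independent of theta
        x = 1 if theta > u else -1      # X_k = sgn(theta - U_k); ties have prob. 0
        s += x                          # S_k = X_1 + ... + X_k
        path.append(s)
    return theta, path
```

Each step of the path moves by exactly $\pm 1$, which is the walk whose Markov property part (b) asks about.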

I am having trouble with part (a) (I'm guessing part (b) will not be so difficult given part (a), but I could be wrong).

I thought I had a semi-sensible attempt at a solution, but while writing it out here, I realized it actually doesn't make sense. So I am now back at square one, and any help would be much appreciated.

Thanks in advance.

1 Answer (accepted)

Hints for (a):

  • With $\theta$ having any distribution on $(0,1)$ and $U_{n+1}$ uniformly distributed on $(0,1)$ independently of $\theta$, conditioning on $\theta$ gives $\mathbb{P}[X_{n+1}=1 \mid \theta] = \mathbb{P}[U_{n+1} \lt \theta \mid \theta] = \theta$, and hence $\mathbb{P}[X_{n+1}=1]=\mathbb{E}[\theta]$
  • You can find the posterior distribution for $\theta$ in the usual way: multiply the prior density for $\theta$ by the likelihood of the observed data, then divide by the integral over $\theta$ so that the result is a probability density ...
  • ... and so you can find the posterior expectation of $\theta$, which is exactly the conditional probability asked for in (a)
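Working the hints through (this closed form is my own derivation, not stated in the answer): the uniform prior is $\mathrm{Beta}(1,1)$, and if $k$ of the first $n$ signs equal $+1$, the likelihood is proportional to $\theta^k(1-\theta)^{n-k}$, so the posterior is $\mathrm{Beta}(k+1,\,n-k+1)$ with mean $(k+1)/(n+2)$, i.e. Laplace's rule of succession. A quick Monte Carlo check (the helper name `check_succession` is hypothetical):

```python
import random

def check_succession(n=5, trials=100_000, seed=1):
    """Estimate P[X_{n+1} = 1 | k plus-signs among X_1..X_n] empirically
    and pair it with the conjectured Bayesian answer (k+1)/(n+2)."""
    rng = random.Random(seed)
    hits = [0] * (n + 1)   # runs with X_{n+1} = 1, binned by k
    total = [0] * (n + 1)  # all runs, binned by k
    for _ in range(trials):
        theta = rng.random()
        # X_j = +1 iff U_j < theta, which happens with probability theta
        xs = [1 if rng.random() < theta else -1 for _ in range(n + 1)]
        k = xs[:n].count(1)
        total[k] += 1
        if xs[n] == 1:
            hits[k] += 1
    return [(k, hits[k] / total[k], (k + 1) / (n + 2))
            for k in range(n + 1) if total[k] > 0]
```

For each $k$ the empirical frequency should sit close to $(k+1)/(n+2)$; for instance with $n=5$ and all five observed signs equal to $+1$, the predicted probability is $6/7 \approx 0.857$ rather than $1$.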