Simplification step in derivation of posterior pdf using Bayes' rule


I am working through *Introduction to Probability* by Bertsekas and Tsitsiklis, and there is an example involving the following setup.

We have $X\sim \text{Uniform}[0,\theta]$ and $\theta$ is an unknown parameter modeled as a random variable $\Theta\sim \text{Uniform}[0,1]$.

We have $f_{\Theta}(\theta) = 1$ if $0\leq \theta \leq 1$ and $f_{\Theta}(\theta) = 0$ otherwise. We also have $f_{X\vert \Theta}(x\vert \theta)= \frac{1}{\theta}$ if $0\leq x \leq \theta$, and $f_{X\vert \Theta}(x\vert \theta)= 0$ otherwise.

Now, I understand how the posterior is set up, and the following steps make sense:

\begin{align*} f_{\Theta\vert X}(\theta \vert x) &= \frac{f_{\Theta}(\theta)f_{X\vert \Theta}(x\vert \theta)}{\int_0^1 f_\Theta(\theta')f_{X\vert \Theta}(x\vert \theta')d\theta'}\\ &=\frac{\frac{1}{\theta}}{\int_x^1 \frac{1}{\theta'}d\theta'} \end{align*}

Now, the next step does not make sense to me. The expression

$$\frac{\frac{1}{\theta}}{\int_x^1 \frac{1}{\theta'}d\theta'}$$

gets simplified to

$$\frac{1}{\theta \cdot \vert{\log{x}}\vert}$$ for $x \leq \theta \leq 1$.

When I evaluate the integral in the denominator, I get

$$\frac{1}{\theta \cdot (-\log{x})}.$$
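As a sanity check, the denominator can be evaluated numerically (a quick sketch; the helper name `normalizer` is mine, not from the book):

```python
import math

def normalizer(x, n=100_000):
    # Midpoint-rule approximation of the normalizing constant
    # ∫_x^1 (1/θ') dθ' for a fixed observation x in (0, 1).
    h = (1.0 - x) / n
    return sum(1.0 / (x + (i + 0.5) * h) for i in range(n)) * h

x = 0.25
print(normalizer(x))   # ≈ 1.3863
print(-math.log(x))    # ≈ 1.3863, matching -log(x)
```

The two printed values agree, so the denominator is indeed $-\log(x)$, which is a positive number for $0 < x < 1$.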

I understand that probabilities cannot be negative, but how are we allowed to take the liberty of simply making the log an absolute value?

**Best answer:**

\begin{align*} \int_x^1 \theta^{-1}\,d\theta & = \log(\theta)\,\big|_{x}^1 \\ & = \log(1) - \log(x) \\ & = -\log(x) \\ & = |\log(x)| \end{align*}

where the final line is true because $\log(x) < 0$ is guaranteed for $0 < x < 1$, and hence $-\log(x)$ is positive. From there it is simply that $z = |z|$ for any positive real $z$. Writing $|\log(x)|$ instead of $-\log(x)$ is therefore not a "liberty": the two expressions are equal on the relevant range, and the absolute-value form just makes the positivity explicit.
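One can also confirm numerically that the stated posterior integrates to $1$ over $x \leq \theta \leq 1$, i.e. that it is a valid density (a sketch; the function names are mine):

```python
import math

def posterior(theta, x):
    # f_{Θ|X}(θ|x) = 1 / (θ · |log x|) for x ≤ θ ≤ 1, and 0 otherwise.
    return 1.0 / (theta * abs(math.log(x)))

def total_mass(x, n=100_000):
    # Midpoint-rule approximation of ∫_x^1 posterior(θ, x) dθ.
    h = (1.0 - x) / n
    return sum(posterior(x + (i + 0.5) * h, x) for i in range(n)) * h

print(total_mass(0.3))  # ≈ 1.0
```

This matches the algebra: $\int_x^1 \frac{d\theta}{\theta\,|\log x|} = \frac{-\log x}{|\log x|} = 1$ for $0 < x < 1$.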