Conditional expectation and optimization


Let $x$ and $z$ be two real random variables with unknown joint distribution but satisfying the following relations: \begin{equation} E\{x\mid z\}=\mu_0 \triangleq \mu(\boldsymbol{\theta}_0,z), \end{equation} \begin{equation} \operatorname{var}\{x\mid z\}=s_0 \triangleq s(\boldsymbol{\theta}_0,z), \end{equation} for some $\boldsymbol{\theta}_0 \in \mathbb{R}^p$. Now define the function \begin{equation} l(\boldsymbol{\theta};x,z) \triangleq \frac{1}{2}\ln\big(2 \pi s(\boldsymbol{\theta},z)\big) + \frac{(x-\mu(\boldsymbol{\theta},z))^2}{2s(\boldsymbol{\theta},z)}. \end{equation}

I need to show that $\bar{l}(\boldsymbol{\theta})\triangleq E_{x,z}\{l(\boldsymbol{\theta};x,z)\}$ is uniquely minimized by $\boldsymbol{\theta}_0$.

Any suggestions?

This seems like a very roundabout way of saying that

$$ \mathsf E\left[\ln b+\frac{(x-a)^2}b\right] $$

is minimized by $a=\mathsf E[x]$ and $b=\mathsf{Var}(x)$, which follows by setting the derivatives with respect to $a$ and $b$ to zero: the $a$-derivative gives $-2(\mathsf E[x]-a)/b=0$, i.e. $a=\mathsf E[x]$, and the $b$-derivative gives $1/b-\mathsf E[(x-a)^2]/b^2=0$, i.e. $b=\mathsf E[(x-a)^2]=\mathsf{Var}(x)$ at the optimal $a$. The minimizer is unique because the objective is strictly convex in $a$, and $b\mapsto\ln b+c/b$ has a single stationary point at $b=c$, which is a minimum. For your problem, condition on $z$ first: by the tower property it suffices to minimize the inner conditional expectation pointwise in $z$, which by the above is achieved exactly when $\mu(\boldsymbol{\theta},z)=\mathsf E[x\mid z]$ and $s(\boldsymbol{\theta},z)=\operatorname{var}(x\mid z)$, i.e. at $\boldsymbol{\theta}=\boldsymbol{\theta}_0$.
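A quick numerical sanity check of this claim (my own sketch, not part of the original answer; the sample, the seed, and the `risk` helper are illustrative choices): for any fixed sample, the empirical risk $\frac1n\sum_i[\ln b+(x_i-a)^2/b]$ is minimized exactly at $a=$ sample mean and $b=$ (biased) sample variance, so perturbing either coordinate must increase it.

```python
import math
import random

# Draw a fixed sample; any distribution with finite variance would do here.
random.seed(0)
x = [random.gauss(2.0, 1.5) for _ in range(10_000)]

def risk(a, b):
    """Empirical version of E[ln b + (x - a)^2 / b]."""
    return sum(math.log(b) + (xi - a) ** 2 / b for xi in x) / len(x)

mean = sum(x) / len(x)
var = sum((xi - mean) ** 2 for xi in x) / len(x)  # biased sample variance

# Perturbing either coordinate away from (mean, var) increases the risk.
base = risk(mean, var)
for a, b in [(mean + 0.1, var), (mean - 0.1, var),
             (mean, 1.2 * var), (mean, 0.8 * var)]:
    assert risk(a, b) > base
print("unique minimum at (mean, var) =", (round(mean, 3), round(var, 3)))
```

Note that the check is exact for the empirical risk, not just approximate: the same stationarity argument as above applies with expectations replaced by sample averages.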