Computing the log-likelihood term


I'm currently reading the paper "The Little Engine that Could: Regularization by Denoising (RED)" by Yaniv Romano and Michael Elad. In the beginning they set up the maximum a posteriori (MAP) estimation problem for a given (measured) image $y$ and an unknown image $x$. That is, one tries to solve ${argmax}_x P(x\vert y)$, which is equivalent to solving ${argmin}_x -\log(P(y\vert x)) -\log(P(x))$. The term $l(y,x)=-\log(P(y\vert x))$ is also known as the log-likelihood term. Then it is stated that $l(y,x)=\frac{1}{2\sigma^2}\vert\vert Hx-y\vert{\vert}_2^2$ if $y=Hx+e$, where $e$ is Gaussian noise with variance $\sigma^2$ and $H$ is a linear operator. This is what I don't get. Can anybody help me to compute the log-likelihood term $l(y,x)$ for the case that $y=Hx+e$?

Thanking you in anticipation,

Christian


1 Answer

BEST ANSWER

The modeling assumption is that $y=Hx + e$, where $e$ follows a Gaussian distribution with mean $0$ and covariance $\sigma^2I$. Let me restate this using the notation $p(e) = \mathcal{N}(e|0, \sigma^2I)$.

This assumption implies that $p(y|x)=\mathcal{N}(y|Hx, \sigma^2 I)$: since $x$ is given, we may treat it as deterministic, so $e = y - Hx$ and the conditional density of $y$ is the density of $e$ shifted by $Hx$.

Therefore, assuming the vectors $x$ and $y$ are $k$-dimensional,

\begin{align} -\log p(y|x) &= -\log \left(\frac{1}{(2\pi\sigma^2)^{k/2}} \exp\left(-\frac{1}{2\sigma^2}(y-Hx)^\top(y-Hx) \right) \right) \\ &= \frac{1}{2\sigma^2}||y-Hx||_2^2 + \text{const.} \end{align}

Assuming $\sigma^2$ is fixed, the "const." is a constant and, therefore, does not affect the arg min.
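A quick numerical sanity check of this identity (a sketch, not from the paper; the dimensions, $H$, $x$, and $\sigma$ below are arbitrary choices): evaluate the full Gaussian negative log-density and compare it with the quadratic term plus the constant $\frac{k}{2}\log(2\pi\sigma^2)$.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
k, sigma = 4, 0.5                       # arbitrary dimension and noise level
H = rng.standard_normal((k, k))         # arbitrary linear operator
x = rng.standard_normal(k)              # "unknown" image
y = H @ x + sigma * rng.standard_normal(k)  # measurement y = Hx + e

# Negative log-likelihood from the full Gaussian density N(y | Hx, sigma^2 I)
nll_full = -multivariate_normal.logpdf(y, mean=H @ x, cov=sigma**2 * np.eye(k))

# Quadratic term plus the constant (k/2) * log(2*pi*sigma^2)
nll_quad = np.sum((y - H @ x) ** 2) / (2 * sigma**2) \
    + (k / 2) * np.log(2 * np.pi * sigma**2)

print(np.isclose(nll_full, nll_quad))  # True
```

Since the constant does not depend on $x$, dropping it leaves the arg min unchanged, which is why only the $\frac{1}{2\sigma^2}\|y-Hx\|_2^2$ term appears in the objective.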