MLE of an increasing nonnegative signal from convex optimization book


I need help solving this problem from the additional exercises to Stephen Boyd's Convex Optimization. It is question 6.6 from the additional exercises.

Maximum likelihood estimation of an increasing nonnegative signal. We wish to estimate a scalar signal $x(t)$, for $t=1,2,...,N,$ which is known to be nonnegative and monotonically nondecreasing:

\begin{equation*} 0 \le x(1) \le x(2) \le \cdots \le x(N) \end{equation*}

This occurs in many practical settings. For example, $x(t)$ might be a measure of wear or deterioration, which can only get worse, or stay the same, as time $t$ increases. We are also given that $x(t) = 0$ for $t \le 0$.

We are given a noise-corrupted moving average of $x$, given by

\begin{equation*} y(t) = \sum_{\tau = 1}^{k} h(\tau)x(t-\tau) + v(t), \quad t = 2,...,N+1 \end{equation*}

where $v(t)$ are independent $N(0,1)$ random variables.

QUESTION

Show how to formulate the problem of finding the maximum likelihood estimate of $x$, given $y$, taking into account the prior assumption that $x$ is nonnegative and monotonically nondecreasing, as a convex optimization problem. Be sure to indicate what the problem variables are, and what the problem data are.

ANSWER

The problem states that you are given $y$. The moving average filter $h$ is known in advance as well, so that is your problem data. You are seeking to find $x$, so that clearly must be your problem variable. But in fact, it can be helpful to treat $v$ as a variable as well.

Because the noise values $v(t)$ are i.i.d. zero-mean Gaussian, the standard, unconstrained maximum likelihood estimate is obtained by minimizing the sum of the squares of $v(t)$. That is,

\begin{equation*}
\begin{array}{ll}
\text{minimize} & \sum_{t=2}^{N+1} v(t)^2 \\
\text{subject to} & y(t) = \sum_{\tau=1}^k h(\tau) x(t-\tau) + v(t), \quad t=2,\dots,N+1
\end{array}
\end{equation*}

All you have to add to that is a set of constraints expressing your prior knowledge about $x$. That gives us

\begin{equation*}
\begin{array}{ll}
\text{minimize} & \sum_{t=2}^{N+1} v(t)^2 \\
\text{subject to} & y(t) = \sum_{\tau=1}^k h(\tau) x(t-\tau) + v(t), \quad t=2,\dots,N+1 \\
& x(N) \geq x(N-1) \geq \dots \geq x(1) \geq 0 \\
& x(t) = 0, \quad t \leq 0
\end{array}
\end{equation*}

It really is that simple. But we can eliminate $v$ if we want, and we can also eliminate any mention of $t \leq 0$ by truncating the convolution sum:

\begin{equation*}
\begin{array}{ll}
\text{minimize} & \sum_{t=2}^{N+1} \left( y(t) - \sum_{\tau=1}^{\min\{k,t-1\}} h(\tau) x(t-\tau) \right)^2 \\
\text{subject to} & x(N) \geq x(N-1) \geq \dots \geq x(1) \geq 0
\end{array}
\end{equation*}
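As a sanity check, here is a minimal numerical sketch of the final formulation. It uses the change of variables $d(1) = x(1)$, $d(t) = x(t) - x(t-1)$ for $t \ge 2$, so the monotone nonnegativity constraint becomes simply $d \geq 0$ and the whole problem reduces to nonnegative least squares, which `scipy.optimize.nnls` solves directly. The instance sizes, filter values, and variable names below are my own illustrative choices, not part of the exercise.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical small instance (sizes and filter chosen for illustration).
rng = np.random.default_rng(0)
N, k = 50, 3
h = np.array([0.5, 0.3, 0.2])            # known moving-average filter (problem data)
x_true = np.cumsum(rng.random(N) * 0.2)  # a nonnegative, nondecreasing signal

# Convolution matrix A: (A @ x)[i] = sum_{tau=1}^{min(k, t-1)} h(tau) x(t - tau)
# for t = 2, ..., N+1; terms with t - tau <= 0 are dropped, encoding x(t) = 0 there.
A = np.zeros((N, N))
for i, t in enumerate(range(2, N + 2)):
    for tau in range(1, k + 1):
        if 1 <= t - tau <= N:
            A[i, (t - tau) - 1] = h[tau - 1]

y = A @ x_true + rng.standard_normal(N)  # noisy moving average, v(t) ~ N(0, 1)

# Change of variables: x = C d with C the lower-triangular all-ones
# (cumulative-sum) matrix, so x nondecreasing and nonnegative  <=>  d >= 0.
C = np.tril(np.ones((N, N)))
d_hat, _ = nnls(A @ C, y)                # nonnegative least squares in d
x_hat = C @ d_hat                        # the ML estimate of x
```

The `nnls` route is just one convenient solver choice for this sketch; a modeling tool such as CVXPY would let you state the problem with the chain of inequalities exactly as written above.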