Optimizing Quadratic Likelihood


I'm trying to implement a parameter estimation method that involves estimating a monic polynomial

$$b(z) = z^M + b_1z^{M-1} + \dots + b_M$$

To find these polynomial coefficients, one has to solve:

$$\hat{\boldsymbol{b}} = \underset{\boldsymbol{b}}{\operatorname{arg\,min}} \operatorname{Tr} \left\{\boldsymbol{B}{(\boldsymbol{B}^H\boldsymbol{B})}^{-1}\boldsymbol{B}^H\hat{\boldsymbol{R}} \right\}$$

where $\hat{\boldsymbol{R}}$ is a sample covariance matrix and $\boldsymbol{B}$ is a banded convolution matrix of the form

$$\boldsymbol{B}=\begin{bmatrix} b_M & b_{M-1} & \cdots & 1 & \cdots & 0 \\ & \ddots & \ddots & &\ddots \\ 0 & & b_M & b_{M-1} & \cdots & 1 \end{bmatrix}$$
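For concreteness, here is a sketch (in Python/NumPy, with the helper names `build_B` and `cost` chosen by me, not taken from the paper) of how the matrix and cost might be formed. It assumes $\boldsymbol{B}$ is taken as the tall $N \times (N-M)$ transpose of the displayed matrix, so that $\boldsymbol{B}^H\boldsymbol{B}$ is square and invertible:

```python
import numpy as np

def build_B(b, N):
    """Build the N x (N - M) banded convolution matrix whose k-th column
    holds [b_M, ..., b_1, 1] starting at row k (the transpose of the wide
    matrix displayed above, so B^H B is (N-M) x (N-M) and invertible)."""
    M = len(b)
    col = np.concatenate(([1.0], b))[::-1]  # [b_M, ..., b_1, 1]
    B = np.zeros((N, N - M), dtype=complex)
    for k in range(N - M):
        B[k:k + M + 1, k] = col
    return B

def cost(b, R_hat):
    """Evaluate Tr{ B (B^H B)^{-1} B^H R_hat } for a coefficient vector b."""
    B = build_B(b, R_hat.shape[0])
    G = B.conj().T @ B
    # By the cyclic property, Tr{B G^{-1} B^H R} = Tr{G^{-1} (B^H R B)},
    # so we can use a linear solve instead of an explicit inverse.
    return np.trace(np.linalg.solve(G, B.conj().T @ R_hat @ B)).real
```

Note that $\boldsymbol{B}(\boldsymbol{B}^H\boldsymbol{B})^{-1}\boldsymbol{B}^H$ is the orthogonal projector onto the column space of $\boldsymbol{B}$, which is why the cost is well defined only when $\boldsymbol{B}$ has full column rank.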

built from the elements of the vector $\boldsymbol{b}$ being estimated. The paper calls this a quadratic problem, which I took to mean it is related to quadratic programming. But to me the structure of this problem looks very different from that of a quadratic program: the cost is not a quadratic function of $\boldsymbol{b}$, since $\boldsymbol{b}$ also enters through the inverse ${(\boldsymbol{B}^H\boldsymbol{B})}^{-1}$. Is there a known algorithm for optimizing this cost function, or does one need to set bounds on the polynomial coefficients and search over the entire $M$-dimensional space?
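Short of an exhaustive grid search, one option worth trying first is to hand the cost directly to a generic local optimizer. The sketch below is only an illustration under my own assumptions (real-valued data, a toy sample covariance built with `sliding_window_view`, a zero starting point, and `scipy.optimize.minimize` with Nelder-Mead), not the paper's method:

```python
import numpy as np
from scipy.optimize import minimize

def cost(b, R_hat):
    """Tr{ B (B^H B)^{-1} B^H R_hat }, with B the tall N x (N-M)
    convolution matrix built from b (real-valued case for simplicity)."""
    N, M = R_hat.shape[0], len(b)
    col = np.concatenate(([1.0], b))[::-1]          # [b_M, ..., b_1, 1]
    B = np.zeros((N, N - M))
    for k in range(N - M):
        B[k:k + M + 1, k] = col
    G = B.T @ B
    return np.trace(np.linalg.solve(G, B.T @ R_hat @ B))

# Toy data (hypothetical): noisy cosine, N-sample snapshots, sample covariance.
rng = np.random.default_rng(0)
N, M = 8, 2
x = np.cos(0.3 * np.arange(50)) + 0.1 * rng.standard_normal(50)
X = np.lib.stride_tricks.sliding_window_view(x, N)  # snapshots of length N
R_hat = (X.T @ X) / X.shape[0]                      # N x N sample covariance

res = minimize(cost, x0=np.zeros(M), args=(R_hat,), method="Nelder-Mead")
b_hat = res.x   # estimated [b_1, ..., b_M]
```

Since the cost is nonconvex in $\boldsymbol{b}$, a local optimizer like this can stall in a local minimum, so a good initialization (or restarts from several starting points) matters; it is meant as a baseline to compare against, not a guaranteed solver.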