Show that minimizing $Tr(Q)$ equals minimizing $x_0^{T}\:Q\:x_0$


In two different textbooks on the Kalman filter, the so-called estimator gain matrix $G$ is obtained as the result of two different minimization problems. I would like to show, or at least justify, that the two problems have the same minimizer $G$:

$$\underset{G\,\in\,\mathbb{R}^{n\times m}}{\arg\min} \;\; \operatorname{Tr}(Q)$$

$$\underset{G\,\in\,\mathbb{R}^{n\times m}}{\arg\min} \;\; x_0^{T}\:Q\:x_0$$

where $$Q=\int_{0}^{t} \,e^{(A+GC)\beta}\,(\Psi+G\Gamma G^{T})\:e^{(A+GC)^{T}\beta}\,d\beta$$

with

$\;\;\;\Psi$ $[n\times n]$ diagonal matrix

$\;\;\;\Gamma$ $[m\times m]$ diagonal matrix

$\;\;\;G$ $[n\times m]$ matrix

$\;\;\;A$ $[n\times n]$ matrix

$\;\;\;C$ $[m\times n]$ matrix

$\;\;\;x_0 \in \mathbb{R^n} \;\;s.t. \;x_0 \neq0$

$\;\;\;A, G, C \;s.t. (A+GC)$ negative definite matrix

$\;\;\;\beta \in \mathbb{R} \,,\:t \in \mathbb{R}$
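As a numerical illustration of the setup, $Q$ can be evaluated directly from its integral definition. All matrices below are small made-up examples (not from either textbook), chosen so that $A+GC$ has eigenvalues with negative real parts:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

# Hypothetical n = 2, m = 1 example; these numbers are illustrative only.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])
G = np.array([[-1.0], [-1.0]])
Psi = np.diag([0.5, 0.2])        # diagonal [n x n]
Gamma = np.diag([0.1])           # diagonal [m x m]
t = 5.0

Acl = A + G @ C                  # closed-loop matrix; stable for this choice of G
M = Psi + G @ Gamma @ G.T

# Q = int_0^t e^{Acl b} M e^{Acl^T b} db, evaluated by vector quadrature
Q, _ = quad_vec(lambda b: expm(Acl * b) @ M @ expm(Acl * b).T, 0.0, t)

print(np.trace(Q))               # the quantity minimized in the first problem
```

The resulting $Q$ is symmetric positive semidefinite, so both $\operatorname{Tr}(Q)$ and $x_0^T Q x_0$ are well-defined nonnegative costs for any candidate $G$.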

Any help is appreciated, thanks.


For the first optimization problem one can use the fact that the trace of a matrix equals the sum of its eigenvalues. The second cost function, however, has the following lower and upper bounds

$$ \lambda_\min(Q)\,\|x_0\|^2 \leq x_0^\top Q\,x_0 \leq \lambda_\max(Q)\,\|x_0\|^2, $$

with $\lambda_\min(Q)$ and $\lambda_\max(Q)$ the smallest and largest eigenvalues of $Q$, respectively. So these cost functions are not the same, and the second one depends on the choice of $x_0$. Only when $\lambda_\min(Q) = \lambda_\max(Q)$ would the cost function be identical for every $x_0$, but I do not see a reason why this should hold given the definition of $Q$.
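These bounds are the Rayleigh-quotient inequalities for a symmetric matrix; a quick numerical check, with an arbitrary made-up positive definite $Q$ and a made-up $x_0$, looks like:

```python
import numpy as np

# Arbitrary symmetric positive definite Q, purely for illustration
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
x0 = np.array([1.0, -2.0])

lam = np.linalg.eigvalsh(Q)          # eigenvalues in ascending order
lam_min, lam_max = lam[0], lam[-1]

quad = x0 @ Q @ x0                   # x0^T Q x0
norm2 = x0 @ x0                      # ||x0||^2

# lambda_min(Q) ||x0||^2 <= x0^T Q x0 <= lambda_max(Q) ||x0||^2
assert lam_min * norm2 <= quad <= lam_max * norm2
```

Equality on both sides at once requires $\lambda_\min(Q) = \lambda_\max(Q)$, i.e. $Q$ a multiple of the identity, which is exactly the degenerate case mentioned above.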

It can be noted, though, that for a given $G$ the matrix $Q$ has to satisfy the following Lyapunov equation

$$ \mathcal{A}\,Q + Q\,\mathcal{A}^\top = e^{\mathcal{A}\,t} \left(\Psi + G\,\Gamma\,G^\top\right) e^{\mathcal{A}^\top t} - \Psi - G\,\Gamma\,G^\top, $$

with $\mathcal{A} = A + G\,C$.
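This identity follows from integrating $\frac{d}{d\beta}\left(e^{\mathcal{A}\beta}(\Psi + G\,\Gamma\,G^\top)e^{\mathcal{A}^\top\beta}\right)$ from $0$ to $t$, and it can be verified numerically. The matrices below are small made-up examples (assumptions for illustration only):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

# Small made-up example (illustrative only)
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])
G = np.array([[-1.0], [-1.0]])
Psi = np.diag([0.5, 0.2])
Gamma = np.diag([0.1])
t = 5.0

Acl = A + G @ C                      # the matrix called \mathcal{A} above
M = Psi + G @ Gamma @ G.T

# Q from its integral definition
Q, _ = quad_vec(lambda b: expm(Acl * b) @ M @ expm(Acl * b).T, 0.0, t)

# Both sides of the Lyapunov equation
lhs = Acl @ Q + Q @ Acl.T
rhs = expm(Acl * t) @ M @ expm(Acl * t).T - M

print(np.max(np.abs(lhs - rhs)))     # small, up to quadrature accuracy
```

For $t \to \infty$ with $\mathcal{A}$ stable, the right-hand side reduces to $-\Psi - G\,\Gamma\,G^\top$, recovering the familiar stationary Lyapunov equation.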