Prove that the error covariance matrix of the Recursive Least Squares Estimation is positive definite


A linear recursive estimator can be written in the following form: $$y_k=H_kx+v_k$$ $$\hat{x}_k=\hat{x}_{k-1}+K_k(y_k-H_k\hat{x}_{k-1})$$ where $H_k$ is an $m\times{}n$ matrix and $K_k$ is $n\times{}m$, referred to as the estimator gain matrix. The term $y_k-H_k\hat{x}_{k-1}$ is the correction that updates the previous estimate $\hat{x}_{k-1}$ to $\hat{x}_k$, and $v_k$ is the measurement noise, assumed to have zero mean.
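As a concrete sketch of one update step (the sizes, gain, and data below are illustrative assumptions, not from the question):

```python
import numpy as np

def rls_update(x_hat_prev, K_k, H_k, y_k):
    """One step of the linear recursive estimator:
    x_hat_k = x_hat_prev + K_k (y_k - H_k x_hat_prev)."""
    correction = y_k - H_k @ x_hat_prev  # measurement residual
    return x_hat_prev + K_k @ correction

# Illustrative sizes: n = 3 states, m = 2 measurements.
rng = np.random.default_rng(0)
x_true = rng.standard_normal(3)
H_k = rng.standard_normal((2, 3))
K_k = rng.standard_normal((3, 2))  # an arbitrary n x m gain
y_k = H_k @ x_true                 # noiseless measurement for the sketch
x_hat_k = rls_update(np.zeros(3), K_k, H_k, y_k)
```

Note that if the residual $y_k-H_k\hat{x}_{k-1}$ is zero, the estimate is left unchanged, which is exactly the role of the correction term.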

The current estimation error is $\epsilon_k=x-\hat{x}_k$; substituting $\hat{x}_k$ into this expression and simplifying, we get: $$\epsilon_k=(I-K_kH_k)\epsilon_{k-1}-K_kv_k$$ where $I$ is the $n\times{}n$ identity matrix.
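The error recursion can be verified numerically on a random instance (all matrices and vectors below are arbitrary stand-ins):

```python
import numpy as np

# Random stand-ins for x, H_k, K_k, v_k (illustrative only).
rng = np.random.default_rng(4)
n, m = 3, 2
x = rng.standard_normal(n)           # true state
x_hat_prev = rng.standard_normal(n)  # previous estimate
H = rng.standard_normal((m, n))
K = rng.standard_normal((n, m))
v = rng.standard_normal(m)           # measurement noise sample

y = H @ x + v
x_hat = x_hat_prev + K @ (y - H @ x_hat_prev)

eps_prev = x - x_hat_prev
eps = x - x_hat
# eps_k = (I - K_k H_k) eps_{k-1} - K_k v_k
assert np.allclose(eps, (np.eye(n) - K @ H) @ eps_prev - K @ v)
```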

Next, we define the $n\times{}n$ estimation-error covariance matrix $P_k=E[\epsilon_k\epsilon_{k}^T]$ and the $m\times{}m$ noise covariance matrix $R_k=E[v_kv_{k}^T]$. Substituting the expression for $\epsilon_k$ into the definition of $P_k$ and simplifying (the cross terms vanish because $v_k$ is zero-mean and uncorrelated with $\epsilon_{k-1}$), we get:

$$P_k=(I-K_kH_k)P_{k-1}(I-K_kH_k)^T+K_kR_kK_k^T$$ We have to prove that $P_k$ is always positive definite if $P_{k-1}$ and $R_k$ are positive definite.
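A quick numerical sanity check of the statement, with random positive definite $P_{k-1}$ and $R_k$ and an arbitrary gain (this only illustrates the claim, it is not the proof):

```python
import numpy as np

def covariance_update(P_prev, K, H, R):
    """Joseph-form covariance update:
    P_k = (I - K H) P_{k-1} (I - K H)^T + K R K^T."""
    n = P_prev.shape[0]
    A = np.eye(n) - K @ H
    return A @ P_prev @ A.T + K @ R @ K.T

rng = np.random.default_rng(1)
n, m = 4, 2
# Random positive definite P_{k-1} and R_k (B B^T + I is always PD).
B = rng.standard_normal((n, n)); P_prev = B @ B.T + np.eye(n)
C = rng.standard_normal((m, m)); R = C @ C.T + np.eye(m)
H = rng.standard_normal((m, n))
K = rng.standard_normal((n, m))
P_k = covariance_update(P_prev, K, H, R)
# P_k is symmetric and all its eigenvalues are strictly positive.
print(np.linalg.eigvalsh(P_k).min() > 0)
```

For a generic random gain, $I-K_kH_k$ is invertible, so this exercises the easy case of the proof below.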

Best Answer

If the matrix $I-K_kH_k$ is invertible, then the result follows from the fact that $(I-K_kH_k)P_{k-1}(I-K_kH_k)^T$ is congruent to $P_{k-1}$, and congruence by an invertible matrix preserves positive definiteness. So $P_{k-1}$ positive definite implies $(I-K_kH_k)P_{k-1}(I-K_kH_k)^T$ positive definite, and adding the positive semidefinite term $K_kR_kK_k^T$ keeps the sum positive definite.
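The congruence fact ($x^TS^TASx=(Sx)^TA(Sx)>0$ for $x\neq 0$ when $S$ is invertible) is easy to check numerically; a sketch with random matrices, assuming nothing beyond $S$ invertible:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + np.eye(n)          # positive definite
S = rng.standard_normal((n, n))  # generically invertible
assert abs(np.linalg.det(S)) > 1e-12
congruent = S.T @ A @ S          # congruent to A
# By Sylvester's law of inertia the signature is preserved, so still PD.
print(np.linalg.eigvalsh(congruent).min() > 0)
```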

Now assume that $I-K_kH_k$ is singular and let $\ell_k$ be the algebraic multiplicity of its zero eigenvalue (which we assume equals the geometric multiplicity). Let $V_k\in\mathbb{R}^{n\times \ell_k}$ be a full-rank matrix such that $(I-K_kH_k)^TV_k=0$ and $V_k^TV_k=I_{\ell_k}$, and let $U_k$ be such that $M_k:=[U_k\ V_k]$ is invertible, i.e. its columns form a basis of $\mathbb{R}^n$.

Then, we obtain that

$$M_k^TP_kM_k=\begin{bmatrix}U_k^T(I-K_kH_k)P_{k-1}(I-K_kH_k)^TU_k & 0\\0 & 0\end{bmatrix}+\begin{bmatrix}U_k^T\\V_k^T\end{bmatrix}K_kR_kK_k^T\begin{bmatrix} U_k & V_k\end{bmatrix}.$$

The term $U_k^T(I-K_kH_k)P_{k-1}(I-K_kH_k)^TU_k$ is positive definite (by the choice of $U_k$, the matrix $(I-K_kH_k)^TU_k$ has full column rank), while the second term is only positive semidefinite, so the second term must supply positive definiteness in the directions annihilated by the first. In particular, a necessary condition is that $V_k^TK_kR_kK_k^TV_k$ is positive definite.

To prove sufficiency, assume that $V_k^TK_kR_kK_k^TV_k$ is positive definite. Then, by a Schur complement argument applied to the positive semidefinite matrix $M_k^TD_kM_k$, where $D_k:=K_kR_kK_k^T$, the matrix

$$U_k^TD_kU_k-U_k^TD_kV_k(V_k^TD_kV_k)^{-1}V_k^TD_kU_k$$ is positive semidefinite. Now take the sum $M_k^TP_kM_k$ of the two matrices above and take the Schur complement of its $(2,2)$ block $V_k^TD_kV_k$ to get

$$U_k^T(I-K_kH_k)P_{k-1}(I-K_kH_k)^TU_k+U_k^TD_kU_k-U_k^TD_kV_k(V_k^TD_kV_k)^{-1}V_k^TD_kU_k.$$

As the first term is positive definite and the remaining two terms form a positive semidefinite matrix, the Schur complement is positive definite; since the $(2,2)$ block $V_k^TD_kV_k$ is positive definite as well, $M_k^TP_kM_k$, and hence $P_k$, is positive definite. This proves that, in the singular case, $P_k$ is positive definite if and only if $V_k^TK_kR_kK_k^TV_k$ is positive definite.
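The singular case can also be exercised numerically. Below, $K_k$ and $H_k$ are hand-picked (an illustrative assumption) so that $I-K_kH_k$ is singular with kernel spanned by $e_1$; the necessary condition $V_k^TD_kV_k\succ 0$ holds, and $P_k$ indeed comes out positive definite:

```python
import numpy as np

n = 4
rng = np.random.default_rng(3)
# Gain/measurement pair making I - K H singular: K H = e1 e1^T.
K = np.zeros((n, 1)); K[0, 0] = 1.0
H = np.zeros((1, n)); H[0, 0] = 1.0
A = np.eye(n) - K @ H            # diag(0, 1, 1, 1): singular
R = np.array([[2.0]])            # positive definite (scalar) noise covariance
B = rng.standard_normal((n, n)); P_prev = B @ B.T + np.eye(n)

# Kernel basis of (I - K H)^T: V = e1.
V = np.zeros((n, 1)); V[0, 0] = 1.0
assert np.allclose(A.T @ V, 0)
D = K @ R @ K.T
# The necessary condition: V^T D V is positive definite.
assert np.linalg.eigvalsh(V.T @ D @ V).min() > 0
# And the full updated covariance is positive definite.
P_k = A @ P_prev @ A.T + D
print(np.linalg.eigvalsh(P_k).min() > 0)
```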