Infimization/Minimization of a matrix product (least-squares-like)


I have a problem of the following form (where $x(t+1) = Ax(t) + Bu(t)$ and $Q = Q^T \geq 0$):

$$\inf_{u(t) = g(x(t))} \mathbb{E}\left[\begin{bmatrix}x(t+1) \\ u(t) \end{bmatrix}^T \underbrace{\begin{bmatrix} Q_x & Q_{x,u} \\ Q_{x,u}^T & Q_u \end{bmatrix}}_Q \begin{bmatrix} x(t+1) \\ u(t) \end{bmatrix}\right]$$

This can be written as (writing $x(t) = x$ and $u(t) = u$ for the sake of clarity):

$$ \inf_{u(t) = g(x(t))} \begin{bmatrix}Ax + Bu \\ u \end{bmatrix}^T \begin{bmatrix} Q_x & Q_{x,u} \\ Q_{x,u}^T & Q_u \end{bmatrix} \begin{bmatrix} Ax+Bu \\ u \end{bmatrix} $$

Expanding the product gives $$ u^T \alpha u + x^T \beta u + u^T \beta^T x + x^T \gamma x,$$ where:

$$ \begin{cases} \alpha & = B^T Q_x B + B^T Q_{x,u} + Q_{x,u}^T B + Q_u > 0 \text{ which was given} \\ \beta &= A^T Q_x B + A^T Q_{x,u} \\ \gamma & = A^T Q_x A \end{cases}$$
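As a numeric sanity check of these definitions (my own sketch, not part of the original question; the dimensions $n$, $m$ and the random matrices are arbitrary placeholders), one can verify that the expanded quadratic form matches the original matrix product:

```python
import numpy as np

# Check that  [Ax+Bu; u]^T Q [Ax+Bu; u]
# equals     u^T a u + x^T b u + u^T b^T x + x^T g x
# with a = alpha, b = beta, g = gamma as defined above.
rng = np.random.default_rng(0)
n, m = 4, 2                                  # arbitrary dimensions
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
Qfull = rng.standard_normal((n + m, n + m))
Q = Qfull.T @ Qfull                          # symmetric PSD Q
Qx, Qxu, Qu = Q[:n, :n], Q[:n, n:], Q[n:, n:]

a = B.T @ Qx @ B + B.T @ Qxu + Qxu.T @ B + Qu   # alpha
b = A.T @ Qx @ B + A.T @ Qxu                    # beta
g = A.T @ Qx @ A                                # gamma

x = rng.standard_normal(n)
u = rng.standard_normal(m)
z = np.concatenate([A @ x + B @ u, u])
lhs = z @ Q @ z
rhs = u @ a @ u + x @ b @ u + u @ b.T @ x + x @ g @ x
assert np.isclose(lhs, rhs)
```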

We can now rewrite $$ u^T \alpha u + x^T \beta u + u^T \beta^T x + x^T \gamma x$$ as (note the scalar $1$ and the scalar $(1,1)$ entry $x^T\gamma x$, so that the whole expression is a scalar):

$$\begin{bmatrix}1 & u^T \end{bmatrix} \underbrace{\begin{bmatrix} x^T\gamma x & x^T \beta \\ \beta^Tx & \alpha \end{bmatrix}}_{M} \begin{bmatrix} 1 \\ u \end{bmatrix} $$

We can now factor $M$ with the help of the Schur complement:

$$ \begin{bmatrix} 1 & -x^T \beta \alpha^{-1} \\ 0 & I\end{bmatrix}\begin{bmatrix} x^T\gamma x & x^T \beta \\ \beta^Tx & \alpha \end{bmatrix}\begin{bmatrix} 1 & 0 \\ -\alpha^{-1}\beta^T x & I\end{bmatrix} = \begin{bmatrix}x^T\left(\gamma - \beta \alpha^{-1} \beta^T\right) x & 0 \\ 0 & \alpha \end{bmatrix}$$
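This congruence can be checked numerically (my own sketch; $\alpha$, $\beta$, $\gamma$ and $x$ below are random placeholders, with $\alpha$ taken symmetric positive definite):

```python
import numpy as np

# Verify: [1, -x^T b a^{-1}; 0, I] M [1, 0; -a^{-1} b^T x, I]
#           = diag(x^T (g - b a^{-1} b^T) x, a)
# where M = [[x^T g x, x^T b], [b^T x, a]].
rng = np.random.default_rng(1)
n, m = 4, 2
a = rng.standard_normal((m, m)); a = a.T @ a + np.eye(m)   # alpha > 0
b = rng.standard_normal((n, m))                            # beta
g = rng.standard_normal((n, n)); g = g.T @ g               # gamma
x = rng.standard_normal(n)

ainv = np.linalg.inv(a)
xb = (x @ b).reshape(1, m)                   # row vector x^T beta
M = np.block([[np.array([[x @ g @ x]]), xb],
              [xb.T, a]])
L = np.block([[np.eye(1), -xb @ ainv],
              [np.zeros((m, 1)), np.eye(m)]])
D = L @ M @ L.T
s = x @ (g - b @ ainv @ b.T) @ x             # Schur complement (scalar here)
assert np.isclose(D[0, 0], s)
assert np.allclose(D[0, 1:], 0) and np.allclose(D[1:, 0], 0)
assert np.allclose(D[1:, 1:], a)
```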

Now I don't exactly know what to do, but I have followed an approach I found which solves least-squares equations. By setting $\alpha\hat{u} = -\beta^Tx$ (and since $\alpha > 0$ we can write $\hat{u} = -\alpha^{-1}\beta^T x$), we can rewrite $M$ as:

$$ M = \begin{bmatrix} x^T\gamma x & x^T \beta \\ \beta^Tx & \alpha \end{bmatrix} = \begin{bmatrix} 1 & -\hat{u}^T \\ 0 & I\end{bmatrix}\begin{bmatrix}x^T\gamma x + x^T \beta \hat{u} & 0 \\ 0 & \alpha \end{bmatrix}\begin{bmatrix} 1 & 0 \\ -\hat{u} & I\end{bmatrix}$$

This gives us the opportunity to write:

$$\begin{bmatrix}1 & u^T \end{bmatrix} \underbrace{\begin{bmatrix} x^T\gamma x & x^T \beta \\ \beta^Tx & \alpha \end{bmatrix}}_{M} \begin{bmatrix} 1 \\ u \end{bmatrix} = \begin{bmatrix} 1 & u^T-\hat{u}^T\end{bmatrix}\begin{bmatrix}x^T\gamma x + x^T \beta \hat{u} & 0 \\ 0 & \alpha \end{bmatrix}\begin{bmatrix}1 \\ u-\hat{u} \end{bmatrix}$$
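The identity above is just a completed square. A quick numeric check (my own sketch with random $\alpha > 0$, $\beta$, $\gamma$, $x$, $u$; note the sign convention $\hat{u} = -\alpha^{-1}\beta^T x$, which is what makes the identity hold):

```python
import numpy as np

# Check: u^T a u + x^T b u + u^T b^T x + x^T g x
#          = x^T (g - b a^{-1} b^T) x + (u - uhat)^T a (u - uhat)
# with uhat = -a^{-1} b^T x.
rng = np.random.default_rng(2)
n, m = 4, 2
a = rng.standard_normal((m, m)); a = a.T @ a + np.eye(m)   # alpha > 0
b = rng.standard_normal((n, m))                            # beta
g = rng.standard_normal((n, n)); g = g.T @ g               # gamma
x = rng.standard_normal(n)
u = rng.standard_normal(m)

uhat = -np.linalg.solve(a, b.T @ x)          # \hat u = -alpha^{-1} beta^T x
cost = u @ a @ u + x @ b @ u + u @ b.T @ x + x @ g @ x
square = x @ (g - b @ np.linalg.inv(a) @ b.T) @ x + (u - uhat) @ a @ (u - uhat)
assert np.isclose(cost, square)
```

Since $\alpha > 0$, the term $(u-\hat{u})^T\alpha(u-\hat{u})$ is nonnegative and vanishes exactly at $u = \hat{u}$.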

Is it true that if we now use $u = \hat{u} = -\alpha^{-1}\beta^Tx$ we end up with the minimal solution, of the form:

$$u^T \alpha u + x^T \beta u + u^T \beta^T x + x^T \gamma x = x^T \left( \gamma - \beta \alpha^{-1} \beta^T \right) x$$

I believe I'm a bit stuck here. If we instead choose an arbitrary $u = Px$ we end up with a similar type of expression, which gives me the impression that the expression above isn't necessarily the minimal/infimal one:

$$ x^T \left(P^T\alpha P + \beta P + P^T \beta^T + \gamma\right) x$$
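One way to convince oneself numerically that the gain $P = -\alpha^{-1}\beta^T$ does beat any other linear gain (a sketch of my own, with random $\alpha > 0$, $\beta$, $\gamma$; the claim is that the difference of the two value matrices is always positive semidefinite):

```python
import numpy as np

# For V(P) = P^T a P + b P + P^T b^T + g, check that
# V(P) - V(Pstar) is PSD for random P, where Pstar = -a^{-1} b^T.
rng = np.random.default_rng(3)
n, m = 4, 2
a = rng.standard_normal((m, m)); a = a.T @ a + np.eye(m)   # alpha > 0
b = rng.standard_normal((n, m))                            # beta
g = rng.standard_normal((n, n)); g = g.T @ g               # gamma

Pstar = -np.linalg.solve(a, b.T)             # candidate optimal gain

def value(P):
    return P.T @ a @ P + b @ P + P.T @ b.T + g

Vstar = value(Pstar)
for _ in range(100):
    P = rng.standard_normal((m, n))
    diff = value(P) - Vstar                  # = (P - Pstar)^T a (P - Pstar)
    assert np.linalg.eigvalsh(diff).min() > -1e-9
```

The difference is $(P - P^*)^T\alpha(P - P^*) \succeq 0$, which is exactly the completed-square argument in matrix form.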

Best answer:

If we evaluate the matrix product explicitly we get (for ease of reading we take $x(t) = x$ and $u(t) = u$): $$ \begin{array}{rcl} &&\begin{pmatrix}Ax+Bu\\ u \end{pmatrix}^T\begin{pmatrix}Q_x & Q_{x,u} \\ Q_{x,u}^T & Q_{u}\end{pmatrix}\begin{pmatrix}Ax + Bu \\ u \end{pmatrix} \\ &=& x^TA^TQ_xAx + x^TA^TQ_x Bu + u^TB^TQ_xAx + u^TB^TQ_x B u + \\ && x^T A^T Q_{x,u}u + u^T B^T Q_{x,u} u + u^TQ_{x,u}^T Ax + u^T Q_{x,u}^TBu + u^TQ_u u \\ &=& u^T \left[B^TQ_xB + B^TQ_{x,u} + Q_{x,u}^TB + Q_u \right] u + x^T\left[A^TQ_xB + A^TQ_{x,u}\right]u + \\ && u^T\left[B^TQ_xA+Q_{x,u}^TA\right]x + x^T\left[A^TQ_xA\right]x \end{array} $$

This means that we end up with an infimization problem of the form:

\begin{multline} \label{eq:part1} \inf_{u(t) = g(x(t))} \begin{pmatrix} x \\ u\end{pmatrix}^T \underbrace{\begin{pmatrix} A^T Q_x A & A^T Q_x B + A^T Q_{x,u} \\ B^T Q_x A + Q_{x,u}^T A & B^TQ_xB + B^TQ_{x,u} + Q_{x,u}^TB + Q_u \end{pmatrix}}_{M}\begin{pmatrix}x\\ u\end{pmatrix} = \\ \inf_{u(t) = g(x(t))} \begin{pmatrix} x \\ u\end{pmatrix}^T \underbrace{\begin{pmatrix} \gamma & \beta \\ \beta^T & \alpha \end{pmatrix}}_{M}\begin{pmatrix}x\\ u\end{pmatrix} \end{multline}

with $\alpha$, $\beta$, $\gamma$ as defined in the question. Now we can apply the Schur complement to the matrix $M$ (it is given that $\alpha > 0$, so its inverse exists):

\begin{equation} \label{eq:Mschur} M = \begin{pmatrix}I & \beta \alpha^{-1} \\ 0 & I \end{pmatrix} \begin{pmatrix} \gamma - \beta \alpha^{-1} \beta^T& 0 \\ 0 & \alpha \end{pmatrix}\begin{pmatrix}I & 0 \\ \alpha^{-1}\beta^T & I \end{pmatrix} \end{equation}

This transforms the infimization problem into:

\begin{equation} \label{eq:part2} \inf_{u(t) = g(x(t))} \begin{pmatrix}x \\ \alpha^{-1}\beta^T x + u\end{pmatrix}^T \begin{pmatrix} \gamma - \beta \alpha^{-1} \beta^T& 0 \\ 0 & \alpha \end{pmatrix} \begin{pmatrix} x \\ \alpha^{-1} \beta^T x + u \end{pmatrix} \end{equation} From the above equation we can now see that the input $u(t)$ can do nothing to influence the term $x^T \left(\gamma - \beta \alpha^{-1} \beta^T \right) x$, but if we choose $u(t) = -\alpha^{-1} \beta^T x(t)$ the second term vanishes and the above equation becomes: \begin{multline} \label{eq:part3} \begin{pmatrix}x(t) \\ 0\end{pmatrix}^T \begin{pmatrix} \gamma - \beta \alpha^{-1} \beta^T& 0 \\ 0 & \alpha \end{pmatrix} \begin{pmatrix} x(t) \\ 0 \end{pmatrix} = x^T(t) \left[\gamma - \beta \alpha^{-1} \beta^T \right] x(t) = \\ x^T(t) \left[A^T Q_x A - \left(A^T Q_x B + A^T Q_{x,u}\right) \right. \\ \left. \left( B^TQ_xB + B^TQ_{x,u} + Q_{x,u}^TB + Q_u\right)^{-1} \left(A^T Q_x B + A^T Q_{x,u}\right)^T\right] x(t) \end{multline} The optimal input $u^*(t) \in \mathbb{R}^m$ for the infimization problem is therefore $u^*(t) = -\left( B^TQ_xB + B^TQ_{x,u} + Q_{x,u}^TB + Q_u\right)^{-1}\left(A^T Q_x B + A^T Q_{x,u}\right)^T x(t)$.
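As a final numeric sanity check of the closed-form optimum (my own sketch; the dimensions and random matrices are arbitrary, and $Q$ is made strictly positive definite so that the $u$-block is invertible), one can verify that no perturbation of $u^*$ lowers the cost:

```python
import numpy as np

# Check that u* = -(B^T Qx B + B^T Qxu + Qxu^T B + Qu)^{-1}
#                   (A^T Qx B + A^T Qxu)^T x
# minimizes  [Ax+Bu; u]^T Q [Ax+Bu; u]  over u.
rng = np.random.default_rng(4)
n, m = 4, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
Qf = rng.standard_normal((n + m, n + m))
Q = Qf.T @ Qf + np.eye(n + m)                # Q > 0 (so the u-block is PD)
Qx, Qxu, Qu = Q[:n, :n], Q[:n, n:], Q[n:, n:]

a = B.T @ Qx @ B + B.T @ Qxu + Qxu.T @ B + Qu
beta = A.T @ Qx @ B + A.T @ Qxu
x = rng.standard_normal(n)

def cost(u):
    z = np.concatenate([A @ x + B @ u, u])
    return z @ Q @ z

ustar = -np.linalg.solve(a, beta.T @ x)      # closed-form optimal input
for _ in range(100):
    u = ustar + 0.1 * rng.standard_normal(m) # random perturbation
    assert cost(ustar) <= cost(u) + 1e-12
```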