Show the minimum value of a quadratic form with submatrix


For $n \ge 2$, let $A$ be an $n \times n$ positive definite matrix, so that $A^{-1}$ is also positive definite.

Write $A^{-1}$ in block form with a scalar $\alpha > 0$, an $(n-1)$-dimensional column vector $\beta$, and an $(n-1) \times (n-1)$ matrix $\Delta$:

$$ A^{-1} = \begin{bmatrix} \alpha & \beta^T \\ \beta & \Delta \end{bmatrix}. $$

Let $\tilde{A}$ denote the $(n-1) \times (n-1)$ submatrix of $A$ consisting of rows $2, \ldots, n$ and columns $2, \ldots, n$. We then know that the following holds:

$$ \tilde{A}^{-1} = \Delta - \dfrac{\beta\beta^T}{\alpha}. $$
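(As a sanity check, the identity above can be verified numerically on a random positive definite matrix; this is only a sketch, assuming NumPy is available, and is of course not a proof.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a random positive definite A (any SPD matrix works for the identity).
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
A_inv = np.linalg.inv(A)

# Block pieces of A^{-1}, following the question's notation.
alpha = A_inv[0, 0]          # scalar
beta = A_inv[1:, 0]          # (n-1)-vector
Delta = A_inv[1:, 1:]        # (n-1) x (n-1) block

# A~ is the trailing (n-1) x (n-1) submatrix of A itself.
A_tilde = A[1:, 1:]

# Schur-complement identity: A~^{-1} = Delta - beta beta^T / alpha.
lhs = np.linalg.inv(A_tilde)
rhs = Delta - np.outer(beta, beta) / alpha
print(np.allclose(lhs, rhs))  # True
```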

Now define $x = [x_1, x_2, \ldots, x_n]^T$ and $\tilde{x} = [x_2, x_3, \ldots, x_n]^T$, and consider $x_1 \mapsto x^T A^{-1} x$ as a quadratic function of $x_1$ alone, keeping the components $x_2, \ldots, x_n$ fixed.

I would like to prove that the minimum of this quadratic as a function of $x_1$, for fixed $\tilde{x} = [x_2, \ldots, x_n]^T$, is $\tilde{x}^T \tilde{A}^{-1} \tilde{x}$.

Based on the description, I set up

$$ \begin{align*} & f(x_1) = x^T A^{-1} x, \\ & \dfrac{\partial f(x_1)}{\partial x_1} = {???} = 0, \end{align*} $$

but I cannot work out the derivative, so this does not yet show what I was supposed to show. How can I revise it?

NOTE: This is a follow-up to this question: Proving the equation of positive definite submatrix

(There the assertion $$\tilde{A}^{-1} = \Delta - \frac{\beta \beta^T}{\alpha}$$ is proved.)


There are 2 best solutions below


For clarity, I'll write your $\tilde{x}$ as an $(n-1)$ column vector $$a = [a_2, \ldots, a_n]^T = \left[\begin{matrix} a_2 \\ \vdots \\a_n \end{matrix}\right],$$ (to make clear it is considered a constant here) and the variable $x_1$ as $x$. We need to show that the minimum of the function $$f : \mathbb{R} \to \mathbb{R} : x \mapsto [x, a_2, \ldots, a_n] A^{-1} [x, a_2, \ldots, a_n]^T = [x \: \: \: a^T] \: A^{-1} \left[\begin{matrix} x \\ a \end{matrix}\right]$$ is equal to $a^T \tilde{A}^{-1} a.$

I'll leave it to you to show that $f$ is differentiable (hint, you can write it as a quadratic function of $x$). First, we show there is precisely one $x$ such that $f'(x) = 0$.

We first calculate \begin{align*} f'(x) &= \left(\frac{d}{dx} [x \: \: \: a^T] \right) A^{-1} \left[\begin{matrix} x \\ a \end{matrix} \right]+ [x \: \: \: a^T] \: A^{-1} \left(\frac{d}{dx} \left[\begin{matrix} x \\ a \end{matrix} \right] \right) \quad \quad (\ast)\\ &= [1, 0, \ldots, 0] \: A^{-1} \left[\begin{matrix} x \\ a\end{matrix} \right] + [x \: \: \: a^T] \: A^{-1} \left[\begin{matrix} 1 \\ 0 \\ \vdots \\ 0\end{matrix} \right] \\ &= [\alpha \: \: \: \beta^T] \: \left[\begin{matrix} x \\ a \end{matrix} \right] + [x \: \: \: a^T] \: \left[\begin{matrix} \alpha \\ \beta \end{matrix} \right] \\ &= 2\alpha x + 2 \beta^T a.\end{align*}

Indeed, we see that for all $x \in \mathbb{R}$, we have $f'(x)= 0$ if and only if $x = -(\beta^T a) / \alpha.$

Filling in this value in $f(x) = \alpha x^2 + 2(\beta^T a) x + a^T \Delta a$, we find $$f\left(-\frac{\beta^T a}{\alpha}\right) = \frac{(\beta^T a)^2}{\alpha} - 2\frac{(\beta^T a)^2}{\alpha} + a^T \Delta a = a^T \left(\Delta - \frac{\beta \beta^T}{\alpha}\right) a = a^T \tilde{A}^{-1} a.$$

Justifying that this is a minimum can be done by showing that $f''(x) > 0$ for all $x \in \mathbb{R}$. This is indeed the case, as $f''(x) = 2 \alpha > 0$.

( ($\ast$): If this rule is not familiar to you and you cannot justify it, you can also further work out the matrix product $[x \: \: \: a^T] \: A^{-1} \left[\begin{matrix} x \\ a \end{matrix}\right]$ as $$f(x) = [x \: \: \: a^T] \: A^{-1} \left[\begin{matrix} x \\ a \end{matrix}\right] = \alpha x^2 + x (\beta^T a) + (a^T \beta )x + a^T \Delta a,$$ which is a quadratic function of $x$ ($\beta^T a = a^T \beta$ is a constant, the scalar product of $\beta$ and $a$, and $a^T \Delta a$ is also a constant). This is probably easier to differentiate.)
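(The stationary point and minimum value above can also be checked numerically on a random instance; a sketch assuming NumPy, not part of the argument.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Random positive definite A (the argument holds for any such A).
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
A_inv = np.linalg.inv(A)

alpha = A_inv[0, 0]
beta = A_inv[1:, 0]
a = rng.standard_normal(n - 1)   # the fixed tail x_2, ..., x_n

def f(x):
    """f(x) = [x, a^T] A^{-1} [x, a^T]^T, with the tail a held constant."""
    v = np.concatenate(([x], a))
    return v @ A_inv @ v

# Stationary point from f'(x) = 2*alpha*x + 2*beta^T a = 0.
x_star = -(beta @ a) / alpha

# Claimed minimum value: a^T A~^{-1} a, with A~ the trailing submatrix of A.
A_tilde_inv = np.linalg.inv(A[1:, 1:])
print(np.isclose(f(x_star), a @ A_tilde_inv @ a))  # True
# f is strictly larger away from the stationary point:
print(f(x_star + 0.5) > f(x_star))                 # True
```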


Let $g(x) = \tilde x^T \tilde A^{-1} \tilde x$. It is easy to prove that

$$f(x) -g(x) = x^T \begin{bmatrix} \alpha & \beta^T \\ \beta & \Delta -\tilde A^{-1}\end{bmatrix} x = \frac1\alpha x^T\begin{bmatrix}\alpha^2 & \alpha \beta^T\\ \alpha\beta & \beta \beta^T\end{bmatrix}x$$

Can you prove that the matrix in the middle is positive semidefinite?
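(Hint in code form: the middle matrix factors as a scaled outer product $\frac{1}{\alpha} v v^T$ with $v = [\alpha, \beta^T]^T$, which makes its semidefiniteness easy to check numerically. A sketch assuming NumPy, with arbitrary illustrative values for $\alpha$ and $\beta$.)

```python
import numpy as np

rng = np.random.default_rng(2)
k = 3                             # plays the role of n - 1

alpha = 1.7                       # any positive scalar (illustrative value)
beta = rng.standard_normal(k)     # arbitrary (n-1)-vector

# The displayed matrix equals (1/alpha) * v v^T with v = [alpha; beta]:
# (1/alpha) [[alpha^2, alpha*beta^T], [alpha*beta, beta*beta^T]].
v = np.concatenate(([alpha], beta))
Mmid = np.outer(v, v) / alpha

# Rank-1 scaled outer product with alpha > 0, hence positive semidefinite:
# eigenvalues are alpha + |beta|^2/alpha (once) and 0 (k times).
eigs = np.linalg.eigvalsh(Mmid)
print(np.all(eigs >= -1e-12))  # True
```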