Let $A \in \mathbb{R}^{n \times n} $ be symmetric positive definite with positive diagonal entries.
I'm trying to show that at each step $m$ of Gaussian elimination, $$ a^{(m+1)}_{ij} = a^{(m)}_{ij} - \frac{a^{(m)}_{im}}{a^{(m)}_{mm}}a^{(m)}_{mj}, \qquad i, j = m+1, \ldots, n,$$
the submatrix $\left\{ a^{(m+1)}_{ij}\right\}_{i,j=m+1}^{n}$ is also symmetric positive definite.
The symmetry part is obvious; it's the positive definiteness part I'm having trouble with.
To show positive definiteness we need to show that for any $x \in \mathbb{R}^n$ with $(x_{m+1}, \ldots, x_n) \neq 0$, $$ \sum_{i=m+1}^{n} \sum_{j=m+1}^{n} a^{(m+1)}_{ij}x_{i}x_{j} > 0.$$
What I've tried so far:
First substitute in $a^{(m+1)}_{ij}$: $$\sum_{i=m+1}^{n} \sum_{j=m+1}^{n} \left( a^{(m)}_{ij} - \frac{a^{(m)}_{im}}{a^{(m)}_{mm}}a^{(m)}_{mj} \right)x_{i}x_{j}$$
WLOG set $x_i = 0 \text{ for } i = 1, \ldots, m$ (these components do not appear in the sums); the sums can then be extended to run from $1$ to $n$:
$$\sum_{i=1}^{n} \sum_{j=1}^{n} \left( a^{(m)}_{ij} - \frac{a^{(m)}_{im}}{a^{(m)}_{mm}}a^{(m)}_{mj} \right)x_{i}x_{j}$$
$$=\sum_{i=1}^{n} \sum_{j=1}^{n} a^{(m)}_{ij}x_{i}x_{j} - \frac{1}{a^{(m)}_{mm}} \sum_{i=1}^{n} \sum_{j=1}^{n} a^{(m)}_{im}a^{(m)}_{mj}x_{i}x_{j} $$
$$ = x^{T}A^{(m)}x - \frac{1}{a^{(m)}_{mm}} {\left( \sum_{j=1}^{n} a^{(m)}_{mj}x_{j}\right)}^{2} $$
$$ = x^{T}A^{(m)}x - \frac{1}{a^{(m)}_{mm}} {\left( e_{m}^{T}A^{(m)}x\right)}^{2} $$
where $e_{m}$ is the $m$-th standard basis vector. (The factorization of the double sum into a square uses the symmetry $a^{(m)}_{im} = a^{(m)}_{mi}$ of the active submatrix.)
This is where I'm stuck.
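As a sanity check, the identity derived so far can be verified numerically. The following sketch (with an arbitrary $3 \times 3$ SPD matrix and $m = 1$; the specific numbers are my own choice, not from the problem) compares the quadratic form of the eliminated submatrix with $x^{T}A^{(m)}x - \frac{1}{a^{(m)}_{mm}} (e_{m}^{T}A^{(m)}x)^{2}$ for a vector $x$ whose first $m$ components vanish:

```python
# Sanity check (illustrative only): for x with x_1 = ... = x_m = 0, the
# quadratic form of the updated submatrix should equal
# x^T A x - (1/a_mm) (e_m^T A x)^2.  Small SPD example, m = 1.

A = [[4.0, 2.0, 2.0],
     [2.0, 3.0, 1.0],
     [2.0, 1.0, 3.0]]
n = len(A)
m = 1                                   # eliminating the first row/column
x = [0.0, 1.0, -2.0]                    # x_i = 0 for i <= m (1-based)

# Left-hand side: quadratic form of the eliminated submatrix
lhs = sum((A[i][j] - A[i][m - 1] * A[m - 1][j] / A[m - 1][m - 1]) * x[i] * x[j]
          for i in range(m, n) for j in range(m, n))

# Right-hand side: x^T A x - (1/a_mm) (e_m^T A x)^2
xAx = sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
emAx = sum(A[m - 1][j] * x[j] for j in range(n))
rhs = xAx - emAx ** 2 / A[m - 1][m - 1]

print(abs(lhs - rhs) < 1e-12)   # True
```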
It is frequently possible to avoid index manipulations by partitioning the problem. Partition the matrix $A \in \mathbb{R}^{n \times n}$ as \begin{equation} A = \begin{pmatrix} \alpha & v^T \\ v & A' \end{pmatrix} \end{equation} where $\alpha \in \mathbb{R}$, $v \in \mathbb{R}^{n-1}$, and $A' \in \mathbb{R}^{(n-1) \times (n-1)}$. Our objective is to show that the Schur complement $A' - \frac{1}{\alpha} vv^T$ is symmetric positive definite. Symmetry is immediate, so it suffices to show that \begin{equation} y^T \left(A' - \frac{1}{\alpha} vv^T\right) y > 0 \end{equation} for all nonzero vectors $y \in \mathbb{R}^{n-1}$.

Let therefore $y \neq 0$ be given. By assumption $A$ is symmetric positive definite, so for any real number $x \in \mathbb{R}$ the vector $(x, y)$ is nonzero and \begin{equation} 0 < \begin{pmatrix} x & y^T \end{pmatrix} \begin{pmatrix} \alpha & v^T \\ v & A' \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \alpha x^2 + 2 x v^T y + y^T A' y. \end{equation} Since $\alpha = e_1^T A e_1 > 0$, the choice of $x = - \frac{1}{\alpha} v^T y$ now allows us to conclude that \begin{equation} 0 < \alpha x^2 + 2 x v^Ty + y^T A' y = \frac{1}{\alpha}(v^T y)^2 - \frac{2}{\alpha} (v^T y)^2 + y^T A' y = y^T A' y -\frac{1}{\alpha}(v^T y)^2 = y^T \left(A' - \frac{1}{\alpha} vv^T\right) y, \end{equation} which is exactly what is needed. One can then proceed inductively: the Schur complement is again symmetric positive definite, so the same argument applies at the next elimination step.
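The partitioned argument can be checked numerically as well. This sketch (again with an arbitrary SPD matrix of my own choosing) performs one elimination step, i.e. forms the Schur complement $A' - \frac{1}{\alpha} vv^T$, and verifies symmetry and positive definiteness of the $2 \times 2$ result via Sylvester's criterion:

```python
# Numerical illustration (not part of the proof): one step of Gaussian
# elimination on a small symmetric positive definite matrix.  The
# trailing submatrix is the Schur complement A' - (1/alpha) v v^T; we
# check that it is symmetric and, via Sylvester's criterion (all
# leading principal minors positive), positive definite.

A = [[4.0, 2.0, 2.0],
     [2.0, 3.0, 1.0],
     [2.0, 1.0, 3.0]]

n = len(A)
alpha = A[0][0]                      # pivot a_11 > 0

# Schur complement: S[i][j] = a_ij - a_i1 * a_1j / a_11
S = [[A[i][j] - A[i][0] * A[0][j] / alpha for j in range(1, n)]
     for i in range(1, n)]

# Symmetry of the Schur complement
assert all(abs(S[i][j] - S[j][i]) < 1e-12
           for i in range(n - 1) for j in range(n - 1))

# Sylvester's criterion for a 2x2 matrix: both leading principal
# minors must be positive.
minor1 = S[0][0]
minor2 = S[0][0] * S[1][1] - S[0][1] * S[1][0]
print(S)                             # [[2.0, 0.0], [0.0, 2.0]]
print(minor1 > 0 and minor2 > 0)     # True
```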