Consider the unpreconditioned CG method for a linear system $Ax=b$. Define $\beta_j = \frac{(r_{j+1},r_{j+1})}{(r_j,r_j)}$ and $\alpha_j=\frac{(r_j,r_j)}{(Ap_j,p_j)}$, where $(x,y) = y^Tx$ denotes the inner product, and $p_j=r_j+\beta_{j-1}p_{j-1}$, $p_0=r_0$.
Then the following holds:
$T_m=R_m^T A R_m$ where $R_m:=\left(\frac{r_0}{\|r_0\|},...,\frac{r_{m-1}}{\|r_{m-1}\|}\right)$ is the matrix whose columns are the normalized residual vectors and $$T_m=\begin{pmatrix} \alpha_0^{-1} & \frac{\sqrt\beta_0}{\alpha_0} & & & \\ \frac{\sqrt\beta_0}{\alpha_0} & \frac{1}{\alpha_1} +\frac{\beta_0}{\alpha_0} & \ddots & & \\ & \ddots & \ddots & \ddots & \\ & & \ddots & \ddots & \frac{\sqrt{\beta_{m-2}}}{\alpha_{m-2}} \\ & & & \frac{\sqrt{\beta_{m-2}}}{\alpha_{m-2}} & \frac{1}{\alpha_{m-1}}+\frac{\beta_{m-2}}{\alpha_{m-2}} \end{pmatrix}$$
I thought that, since $R_m$ is orthogonal, we can write $R_m^TAR_m=T_m$ as $A=R_mT_mR_m^T$ and simply calculate this, but this quickly got messy and confusing.
Hence: I tried induction over $m$. (EDIT: Since this led to confusion: I am trying to show the original proposition here, i.e. $T_m=R_m^TAR_m$, and not my - seemingly wrong - changed version aka $A=...$)
For $m=1$ the claim obviously holds.
But - of course - I got stuck doing the calculation for $m\to m+1$. I tried a direct approach again, but quickly lost track of what I was doing.
How can I show this without having to calculate the matrix-products? Any input is welcome :) Thanks!
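EDIT 2: For anyone who wants to experiment, here is a quick numerical sanity check of the proposition (my own sketch in NumPy, with a made-up small SPD test problem). Note that with the normalization $r_j/\|r_j\|$ the off-diagonal entries of $R_m^TAR_m$ come out negative; the $T_m$ above corresponds to the alternating-sign normalization $(-1)^j r_j/\|r_j\|$, so the comparison below is in absolute value.

```python
import numpy as np

# Sanity check: run plain CG on a small random SPD system, keep the
# residuals r_j and the coefficients alpha_j, beta_j, and compare
# R_m^T A R_m with the tridiagonal T_m built from those coefficients.
rng = np.random.default_rng(0)
n, m = 10, 5
C = rng.standard_normal((n, n))
A = C @ C.T + n * np.eye(n)          # symmetric positive definite by construction
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x
p = r.copy()
res, alphas, betas = [r.copy()], [], []
for _ in range(m):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    alphas.append(alpha)
    betas.append(beta)
    r = r_new
    res.append(r.copy())

# R_m: columns r_0/||r_0||, ..., r_{m-1}/||r_{m-1}||
R = np.column_stack([v / np.linalg.norm(v) for v in res[:m]])

# T_m assembled from the alpha_j, beta_j exactly as in the proposition
T = np.zeros((m, m))
T[0, 0] = 1.0 / alphas[0]
for i in range(1, m):
    T[i, i] = 1.0 / alphas[i] + betas[i - 1] / alphas[i - 1]
    T[i, i - 1] = T[i - 1, i] = np.sqrt(betas[i - 1]) / alphas[i - 1]

# Comparison up to the residual sign convention (see note above).
print(np.allclose(np.abs(R.T @ A @ R), np.abs(T)))
```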
The relation can actually be obtained directly from CG without any reference to the Lanczos algorithm. It is quite straightforward, but very tedious.

Let $P_m=[p_0,\ldots,p_m]$ and $R_m=[r_0/\|r_0\|,\ldots,r_m/\|r_m\|]$. Let also $S_j=\mathrm{diag}(\|r_0\|,\ldots,\|r_j\|)$, so that $R_mS_m$ is the matrix of non-normalized residual vectors. From $r_0=p_0$ and $r_j=p_j-\beta_{j-1}p_{j-1}$, $j=1,\ldots,m$, we have $$ R_mS_m=P_mU_m, \quad U_m=\begin{bmatrix} 1 & -\beta_0 & & \\ & \ddots & \ddots & \\ & & \ddots & -\beta_{m-1} \\ & & & 1 \end{bmatrix}.\tag{1} $$ The matrix $U_m$ is a nonsingular upper bidiagonal matrix with $1$ on the main diagonal and $-\beta_0,\ldots,-\beta_{m-1}$ on the first super-diagonal.

Now consider $r_j=r_{j-1}-\alpha_{j-1}Ap_{j-1}$ for $j=1,\ldots,m+1$, from which we have $$ AP_m=R_{m+1}S_{m+1}\underline{L}_{m+1,m}D_m^{-1},\tag{2} $$ where $\underline{L}_{m+1,m}$ is the $(m+2)\times (m+1)$ matrix with $1$ on the main diagonal and $-1$ on the first sub-diagonal (hence a rectangular lower bidiagonal matrix), and $D_m=\mathrm{diag}(\alpha_0,\ldots,\alpha_m)$.

Putting (1) and (2) together leads to $$ AR_mS_mU_m^{-1}=R_{m+1}S_{m+1}\underline{L}_{m+1,m}D_m^{-1} \quad \Rightarrow \quad AR_m=R_{m+1}S_{m+1}\underline{L}_{m+1,m}D_m^{-1}U_mS_m^{-1}. $$ Now since $R_m^TR_{m+1}=[I,0]$ (the $(m+1)\times(m+1)$ identity augmented by a zero column), we get $$ R_m^TAR_m=[I,0]S_{m+1}\underline{L}_{m+1,m}D_m^{-1}U_mS_m^{-1} =S_mL_mD_m^{-1}U_mS_m^{-1}, $$ where $L_m$ is the upper $(m+1)\times(m+1)$ part of $\underline{L}_{m+1,m}$, that is, $$ L_m = \begin{bmatrix} 1 & & & \\ -1 & \ddots & & \\ & \ddots & 1 & \\ & & -1 & 1 \end{bmatrix}\in\mathbb{R}^{(m+1)\times(m+1)}. $$ Since $S_m$ and $D_m$ are diagonal and $L_m$ and $U_m$ are, respectively, lower and upper bidiagonal, the matrix $T_m=R_m^TAR_m=S_mL_mD_m^{-1}U_mS_m^{-1}$ is tridiagonal.
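This chain of identities is easy to verify numerically. Here is a sketch (the small SPD test problem and all variable names are my own) checking (1), (2), and the final product:

```python
import numpy as np

# Verify (1), (2) and R_m^T A R_m = S_m L_m D_m^{-1} U_m S_m^{-1}
# on a small random SPD system.
rng = np.random.default_rng(1)
n, m = 12, 4
C = rng.standard_normal((n, n))
A = C @ C.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x
p = r.copy()
res, dirs, alphas, betas = [r.copy()], [p.copy()], [], []
for _ in range(m + 1):               # need r_0..r_{m+1} and p_0..p_m
    Ap = A @ p
    a = (r @ r) / (p @ Ap)
    x = x + a * p
    r_new = r - a * Ap
    be = (r_new @ r_new) / (r @ r)
    p = r_new + be * p
    alphas.append(a)
    betas.append(be)
    r = r_new
    res.append(r.copy())
    dirs.append(p.copy())

norms = [np.linalg.norm(v) for v in res]
Rm   = np.column_stack([res[j] / norms[j] for j in range(m + 1)])
Rm1  = np.column_stack([res[j] / norms[j] for j in range(m + 2)])
Pm   = np.column_stack(dirs[:m + 1])
Sm   = np.diag(norms[:m + 1])
Sm1  = np.diag(norms[:m + 2])
Dm   = np.diag(alphas[:m + 1])
Um   = np.eye(m + 1) - np.diag(betas[:m], 1)              # 1 on diag, -beta_j above
Lund = np.eye(m + 2, m + 1) - np.eye(m + 2, m + 1, k=-1)  # underlined L_{m+1,m}
Lm   = Lund[:m + 1, :]                                    # its upper square part

print(np.allclose(Rm @ Sm, Pm @ Um))                               # (1)
print(np.allclose(A @ Pm, Rm1 @ Sm1 @ Lund @ np.linalg.inv(Dm)))   # (2)
Tm = Sm @ Lm @ np.linalg.inv(Dm) @ Um @ np.linalg.inv(Sm)
print(np.allclose(Rm.T @ A @ Rm, Tm))                              # tridiagonal T_m
```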
To obtain the entries of $T_m$, "simply" compute the product :-) For the diagonal elements, we have $$ (T_m)_{1,1} = \|r_0\|\frac{1}{\alpha_0}\frac{1}{\|r_0\|}=\frac{1}{\alpha_0} $$ and for $i\geq 2$, $$ (T_m)_{i,i} = \|r_{i-1}\|[-1,1]\begin{bmatrix}\alpha_{i-2}^{-1} & 0 \\ 0 & \alpha_{i-1}^{-1}\end{bmatrix}\begin{bmatrix}-\beta_{i-2}\\1\end{bmatrix}\frac{1}{\|r_{i-1}\|}= \frac{1}{\alpha_{i-1}}+\frac{\beta_{i-2}}{\alpha_{i-2}}. $$ Similarly, the off-diagonal entries of $T_m$ can be evaluated using $\sqrt{\beta_{j-1}}=\|r_j\|/\|r_{j-1}\|$. Good luck :-)
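The product structure can also be sanity-checked without running CG at all: pick arbitrary positive $\alpha_j,\beta_j$ (made-up coefficients, not CG output), choose the norms consistently through $\sqrt{\beta_{j-1}}=\|r_j\|/\|r_{j-1}\|$, and compare $S_mL_mD_m^{-1}U_mS_m^{-1}$ with the closed-form entries:

```python
import numpy as np

# Entry-wise check of S_m L_m D_m^{-1} U_m S_m^{-1} with made-up positive
# coefficients; the norms are built so that sqrt(beta_{j-1}) = ||r_j||/||r_{j-1}||
# holds by construction.
rng = np.random.default_rng(3)
m = 5
alphas = rng.uniform(0.5, 2.0, m + 1)
betas = rng.uniform(0.1, 1.0, m + 1)
norms = [1.0]
for j in range(m):
    norms.append(np.sqrt(betas[j]) * norms[-1])   # ||r_{j+1}||

S = np.diag(norms)                                # (m+1) x (m+1)
D = np.diag(alphas)
U = np.eye(m + 1) - np.diag(betas[:m], 1)         # upper bidiagonal
L = np.eye(m + 1) - np.eye(m + 1, k=-1)           # lower bidiagonal

T = S @ L @ np.linalg.inv(D) @ U @ np.linalg.inv(S)

print(np.isclose(T[0, 0], 1 / alphas[0]))                          # (T)_{1,1}
print(all(np.isclose(T[i, i], 1 / alphas[i] + betas[i - 1] / alphas[i - 1])
          for i in range(1, m + 1)))                               # (T)_{i,i}, i >= 2
# Off-diagonals have magnitude sqrt(beta_i)/alpha_i; in this normalization
# they carry a minus sign (flipping the residual signs as (-1)^j removes it).
print(all(np.isclose(T[i, i + 1], -np.sqrt(betas[i]) / alphas[i])
          for i in range(m)))
print(np.allclose(T, T.T))                                         # symmetric
```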