Over the last year, as part of my PhD studies, I have been implementing a code to solve the many-body Schrödinger equation for applications in nuclear physics. The most important step in that code is the use of the Lanczos algorithm to find the lowest eigenvalues of the Hamiltonian matrix. Like most numerical algorithms for diagonalizing symmetric matrices, Lanczos reduces the matrix to a tridiagonal matrix with the same eigenvalues, which can then be diagonalized easily. In a summer course I took back in 2017, a lecturer cited a specific theorem stating that any symmetric (diagonalizable) matrix can be reduced to a tridiagonal matrix. Unfortunately, I did not write it down. Now that I'm writing my dissertation, I would like to cite that theorem when I explain the Lanczos algorithm. I have looked in every source I can think of but can't find any mention of such a theorem; every text on numerical diagonalization of symmetric matrices seems to take that property as obvious and doesn't cite anything. Does anyone here know the name of that theorem, or a source where it is proved?
2026-03-25 19:51:34
What is the name of the theorem of tridiagonal reduction of symmetric matrices?
332 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 answer below.
No theorem needs to be invoked in order to perform the similarity transformation in the Lanczos algorithm. It is correct by construction.
In the comments, you cited this link as a reference. In Section 2.1, $\alpha_i$, $\beta_i$, and $v_i$ are defined by Eq. (1). $T$ is defined as the tridiagonal matrix built from the $\alpha_i$ and $\beta_i$, and $V$ as the matrix whose columns are the $v_i$. With $T$ and $V$ defined this way, rewriting Eq. (1) as a single matrix equation gives Eq. (3), and Eq. (3) is an orthogonal similarity transform.
This is why the algorithm is so brilliant. If you pick the construction correctly, you automatically get a similarity transformation.
If you still have questions about the algorithm, then the misunderstanding involves one of the steps above, probably the step where you re-write Eq. (1) as a matrix equation.
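The construction described above is short enough to check numerically. Below is a minimal NumPy sketch, not the linked reference's code: all the names (`lanczos`, `alpha`, `beta`, `V`, `T`) are mine, and I have added full reorthogonalization at each step (an assumption beyond the bare three-term recurrence) so that the floating-point check of Eq. (3) actually passes.

```python
import numpy as np

def lanczos(A, v0, m):
    """Sketch of the Lanczos three-term recurrence (Eq. (1) in the reference).

    Builds V (n x m, orthonormal columns) and a tridiagonal T (m x m)
    such that V.T @ A @ V == T, which is Eq. (3) restricted to the
    Krylov basis. Full reorthogonalization is included for stability.
    """
    n = A.shape[0]
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w          # diagonal entry of T
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        # project out all previous basis vectors (guards against drift)
        w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)  # off-diagonal entry of T
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return V, T

# Random symmetric test matrix (illustrative choice)
rng = np.random.default_rng(0)
n, m = 50, 10
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
V, T = lanczos(A, rng.standard_normal(n), m)
# Eq. (3): T agrees with V^T A V by construction
print(np.allclose(V.T @ A @ V, T, atol=1e-8))  # prints True
```

Nothing here needs a theorem: the tridiagonal shape of $T$ falls out of the recurrence itself, which is exactly the "correct by construction" point.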
As an aside, I want to point out that there are important things left to be proved; it's just that they come after Eq. (3). There are at least three things that pop out at me:
- Typically we halt the iteration before reaching $n$ and argue that the extreme eigenvalues and eigenvectors of $T$ approximate those of $A$. This needs proof, and I have never read a full proof of it, but I have heard that it is easy for Hermitian matrices and hard for non-Hermitian matrices (the Arnoldi iteration).
- How do we know that none of the $v_i$ are zero vectors? (If any of them were zero, then $V$ would not be an orthogonal matrix. We do, however, know that they are all mutually orthogonal by construction.)
- How do we know that the algorithm will be numerically stable in practice? (The naive implementation isn't.)
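The last point is easy to observe experimentally. A hedged NumPy sketch (all names are mine; the test matrix is an assumed example with one well-separated eigenvalue, which is precisely the situation where the naive recurrence is known to drift once a Ritz pair converges):

```python
import numpy as np

def lanczos_basis(A, v0, m, reorthogonalize):
    """Three-term Lanczos recurrence; optionally reorthogonalize each step."""
    n = A.shape[0]
    V = np.zeros((n, m))
    V[:, 0] = v0 / np.linalg.norm(v0)
    beta = 0.0
    for j in range(m - 1):
        w = A @ V[:, j] - (beta * V[:, j - 1] if j > 0 else 0.0)
        w -= (V[:, j] @ w) * V[:, j]
        if reorthogonalize:
            # project out all previous basis vectors each step
            w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        beta = np.linalg.norm(w)
        V[:, j + 1] = w / beta
    return V

def ortho_loss(V):
    """Worst-case deviation of V^T V from the identity."""
    return np.max(np.abs(V.T @ V - np.eye(V.shape[1])))

# Symmetric matrix with one well-separated eigenvalue: its Ritz pair
# converges quickly, which is when naive Lanczos loses orthogonality.
n, m = 200, 60
A = np.diag(np.concatenate(([100.0], np.linspace(1.0, 2.0, n - 1))))
rng = np.random.default_rng(1)
v0 = rng.standard_normal(n)

loss_naive = ortho_loss(lanczos_basis(A, v0, m, reorthogonalize=False))
loss_reorth = ortho_loss(lanczos_basis(A, v0, m, reorthogonalize=True))
print(loss_naive, loss_reorth)
```

In exact arithmetic both variants would give an orthonormal basis; in floating point the naive basis drifts far from orthonormality while the reorthogonalized one stays at roundoff level, which is why practical implementations add some form of (selective) reorthogonalization.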