Taylor Expansion of Eigenvector Perturbation

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Consider symmetric matrices $A, B \in \mathbb{R}^{n \times n}$ and let the difference matrix be $D = B - A$. If $\hat{\mathbf{a}}_i$ and $\hat{\mathbf{b}}_i$ are the $i^{\text{th}}$ eigenvectors of $A$ and $B$ respectively, and $\lambda_i$ is the $i^{\text{th}}$ eigenvalue of $A$, then the first-order Taylor expansion gives the following approximation for the difference between the primary eigenvectors: \begin{align} \hat{\mathbf{b}}_1 - \hat{\mathbf{a}}_1 &= \sum_{i = 2}^n \frac{\hat{\mathbf{a}}_i^T D\, \hat{\mathbf{a}}_1}{\lambda_1 - \lambda_i}\,\hat{\mathbf{a}}_i + O(D^2)\\ &\approx \sum_{i = 2}^n \frac{\hat{\mathbf{a}}_i^T D\, \hat{\mathbf{a}}_1}{\lambda_1 - \lambda_i}\,\hat{\mathbf{a}}_i \end{align} My question is: where does this Taylor expansion come from?
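For concreteness, the claimed approximation can be checked numerically. The following is a minimal sketch, not part of the question itself (it assumes NumPy; the random symmetric $A$ and small symmetric $D$ are purely illustrative, and the largest eigenpair is taken as the "primary" one):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Random symmetric A and a small symmetric perturbation D = B - A.
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
D = 1e-3 * rng.standard_normal((n, n)); D = (D + D.T) / 2
B = A + D

lam, V = np.linalg.eigh(A)          # columns of V are the eigenvectors a_i
mu, W = np.linalg.eigh(B)           # columns of W are the eigenvectors b_i
a1, b1 = V[:, -1], W[:, -1]         # eigenvectors of the largest eigenvalues
b1 = b1 * np.sign(a1 @ b1)          # fix the arbitrary sign returned by eigh

# Right-hand side of the approximation:
# sum over i != 1 of (a_i^T D a_1) / (lambda_1 - lambda_i) * a_i
approx = sum((V[:, i] @ D @ a1) / (lam[-1] - lam[i]) * V[:, i]
             for i in range(n - 1))

# Tiny compared to ||b1 - a1||: the leftover error is of order ||D||^2.
print(np.linalg.norm((b1 - a1) - approx))
```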
This is essentially the same process as what is called time-independent perturbation theory in quantum mechanics. Provided the $a_i$ form an orthonormal basis, so $a_i^T a_j = \delta_{ij}$, we can always write $$ b_1 = \alpha\left( a_1 + \sum_{j \neq 1} \beta_j a_j \right), \qquad \mu_1 = \lambda_1 + \nu, $$ where $\mu_1$ is the eigenvalue of $B$ with eigenvector $b_1$, and $\alpha$ is chosen so that $b_1^T b_1 = 1$. Then the eigenvalue equation for $B$ is $$ Bb_1 = \mu_1 b_1, $$ which, after dividing both sides by $\alpha$, becomes $$ (A+D)\Big(a_1 + \sum_{j \neq 1} \beta_j a_j \Big) = (\lambda_1+\nu)\Big(a_1 + \sum_{j \neq 1} \beta_j a_j \Big). $$ Using the eigenvalue equation $A a_j = \lambda_j a_j$ and cancelling the $\lambda_1 a_1$ terms gives $$ Da_1 + \sum_{j \neq 1} \beta_j Da_j = \sum_{j \neq 1} (\lambda_1-\lambda_j)\beta_j a_j + \nu \Big(a_1 + \sum_{j \neq 1} \beta_j a_j \Big). $$
Now applying $a_1^T$ and using orthonormality gives $$ \nu = a_1^T D a_1 + \sum_{j \neq 1} \beta_j a_1^T Da_j, \tag{1} $$ while applying $a_k^T$ for $k \neq 1$ gives $$ (\lambda_1-\lambda_k+\nu)\beta_k = a_k^T Da_1 + \sum_{j \neq 1} \beta_j a_k^T Da_j. \tag{2} $$
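As a sanity check (not part of the original argument), $(1)$ and $(2)$ can be verified directly, since they hold exactly and not only to first order: extract $\alpha$, $\beta_j$ and $\nu$ from an actual eigendecomposition and substitute. A small NumPy sketch of that check, with illustrative random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
D = 0.1 * rng.standard_normal((n, n)); D = (D + D.T) / 2   # need not be small

lam, V = np.linalg.eigh(A)                  # a_j = V[:, j], lambda_j = lam[j]
mu, W = np.linalg.eigh(A + D)
a1 = V[:, -1]
b1 = W[:, -1] * np.sign(a1 @ W[:, -1])

# b1 = alpha * (a_1 + sum_j beta_j a_j)  =>  alpha = a_1^T b1, beta_j = a_j^T b1 / alpha
alpha = a1 @ b1
beta = (V[:, :-1].T @ b1) / alpha           # beta_j for j != 1
nu = mu[-1] - lam[-1]

# Equation (1): nu = a_1^T D a_1 + sum_j beta_j a_1^T D a_j
# (a_1^T D a_j = a_j^T D a_1 because D is symmetric)
eq1_rhs = a1 @ D @ a1 + beta @ (V[:, :-1].T @ D @ a1)
# Equation (2): (lambda_1 - lambda_k + nu) beta_k = a_k^T D a_1 + sum_j beta_j a_k^T D a_j
eq2_lhs = (lam[-1] - lam[:-1] + nu) * beta
eq2_rhs = V[:, :-1].T @ D @ a1 + (V[:, :-1].T @ D @ V[:, :-1]) @ beta

print(abs(nu - eq1_rhs), np.max(np.abs(eq2_lhs - eq2_rhs)))   # both at machine precision
```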
So far, these expressions are exact. We suppose that the eigenvalues and eigenvectors vary smoothly with the matrix, so we can expand as follows: $$ \nu = 0+\epsilon \nu^{(1)} + O(\epsilon^2), \qquad \alpha = 1 + \epsilon \alpha^{(1)} + O(\epsilon^2), \qquad \beta_j = 0+\epsilon \beta_j^{(1)} + O(\epsilon^2), $$ which reflects the correct behaviour as $\epsilon=\lVert D \rVert \to 0$. Writing $D=\epsilon E$ and substituting these into $(1)$ and $(2)$ gives the first-order equations $$ \nu^{(1)} = a_1^T E a_1, \qquad \beta_k^{(1)} = \frac{a_k^T E a_1}{\lambda_1-\lambda_k}, $$ which is almost the expression you want, but we still need to find $\alpha^{(1)}$. The normalisation $b_1^T b_1 = 1$ gives $$ 1 = \big(1+2\epsilon\alpha^{(1)}+O(\epsilon^2)\big) \left( 1 + 2\epsilon\sum_{j \neq 1} \beta_j^{(1)} a_1^T a_j + O(\epsilon^2) \right) = 1+2\epsilon\big(\alpha^{(1)}+0\big)+O(\epsilon^2) $$ by orthogonality, so $\alpha^{(1)}=0$, and $\alpha$ does not affect the first-order term. Hence $$ b_1 = a_1 + \sum_{j \neq 1} \frac{a_j^T D a_1}{\lambda_1-\lambda_j}\, a_j + O(\lVert D \rVert^2), $$ which is the quoted expansion, since $a_1^T D a_j = a_j^T D a_1$ for symmetric $D$. NB: this requires that $\lambda_1 \neq \lambda_j$ for all $j \neq 1$. If there are other eigenvectors with the same eigenvalue, the expansion is more complicated; the Wikipedia article on perturbation theory explains this, although it does use quantum mechanics notation.
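One way to convince yourself that these really are the first-order terms is to scale the perturbation and watch the errors shrink quadratically. Again, this is only an illustrative NumPy sketch (the matrices and step sizes are made up for the demonstration), not part of the derivation above:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
E = rng.standard_normal((n, n)); E = (E + E.T) / 2   # fixed "direction" of the perturbation

lam, V = np.linalg.eigh(A)
a1 = V[:, -1]

for eps in (1e-1, 1e-2, 1e-3):
    D = eps * E
    mu, W = np.linalg.eigh(A + D)
    b1 = W[:, -1] * np.sign(a1 @ W[:, -1])

    # First-order predictions: mu_1 ~ lambda_1 + a_1^T D a_1 and
    # b_1 ~ a_1 + sum_{j != 1} (a_j^T D a_1)/(lambda_1 - lambda_j) a_j
    nu_pred = a1 @ D @ a1
    b1_pred = a1 + sum((V[:, j] @ D @ a1) / (lam[-1] - lam[j]) * V[:, j]
                       for j in range(n - 1))

    print(eps,
          abs((mu[-1] - lam[-1]) - nu_pred),   # eigenvalue error, shrinks like eps^2
          np.linalg.norm(b1 - b1_pred))        # eigenvector error, shrinks like eps^2
```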