Let $A$ be a near-Jordan block, that is, a matrix obtained from a Jordan block by possibly changing the first column. Prove that no two Jordan blocks in any Jordan canonical form for $A$ have the same eigenvalue.
Jordan Block Eigenvalue Proof
1.4k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There is 1 best solution below.
Let $A \in M_n(\mathbb{F})$ be a perturbation of an $n \times n$ Jordan block with eigenvalue $\lambda \in \mathbb{F}$, so that
$$ A = \begin{pmatrix} a_1 & 1 & 0 & \dots & 0 \\ a_2 & \lambda & 1 & \ddots & \vdots \\ \vdots & \ddots & \ddots & \ddots & 0 \\ a_{n-1} & \ddots & 0 & \lambda & 1 \\ a_n & 0 & 0 & 0 & \lambda \end{pmatrix}. $$
We can write $A = B + C$ where $B = \lambda I + N$ and
$$ N = \begin{pmatrix} 0 & 1 & 0 & \dots & 0 \\ 0 & 0 & 1 & \ddots & \vdots \\ \vdots & \ddots & \ddots & \ddots & 0 \\ 0 & \ddots & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix}, C = \begin{pmatrix} a_1 - \lambda & 0 & \dots & 0 \\ a_2 & 0 & \dots & 0 \\ \vdots & \vdots & \dots & \vdots \\ a_{n-1} & 0 & \dots & 0 \\ a_n & 0 & \dots & 0\end{pmatrix}. $$
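As a quick sanity check (not part of the proof), the decomposition $A = B + C$ can be verified symbolically, here for the illustrative choice $n = 4$ with symbolic entries $a_1, \dots, a_4$ and $\lambda$:

```python
# Sketch: build A, B = lam*I + N, and C for n = 4 with symbolic
# entries, and confirm the decomposition A = B + C entrywise.
import sympy as sp

n = 4
lam = sp.symbols('lam')
a = sp.symbols('a1:%d' % (n + 1))  # a1, ..., a4

# A: Jordan block with eigenvalue lam whose first column is a1, ..., an
A = sp.zeros(n, n)
for i in range(n):
    A[i, 0] = a[i]
    if i > 0:
        A[i, i] = lam      # diagonal entries (except the (1,1) slot)
    if i < n - 1:
        A[i, i + 1] = 1    # superdiagonal of ones

# N: nilpotent shift (ones on the superdiagonal)
N = sp.zeros(n, n)
for i in range(n - 1):
    N[i, i + 1] = 1
B = lam * sp.eye(n) + N

# C: the perturbation, supported on the first column
C = sp.zeros(n, n)
C[0, 0] = a[0] - lam
for i in range(1, n):
    C[i, 0] = a[i]

assert sp.simplify(A - (B + C)) == sp.zeros(n, n)
```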
Proving that any two blocks in the Jordan form of $A$ are associated with different eigenvalues is equivalent to proving that the minimal polynomial and the characteristic polynomial of $A$ coincide. By Cayley-Hamilton, the minimal polynomial divides the characteristic polynomial, so it is enough to prove that $p(A) \neq 0$ for every nonzero polynomial $p$ of degree $k$ with $0 < k < n$.
For $B$, we know that the characteristic and the minimal polynomial coincide. Let us recall why. Denote by $(e_1, \dots, e_n)$ the standard basis vectors, so that $N^i e_n = e_{n-i}$ for $0 \leq i \leq n - 1$. Now, if $p \in \mathbb{F}[X]$ is a polynomial of degree $k < n$, we claim that $p(B)e_n \in \operatorname{span} \{ e_n, \dots, e_{n-k} \}$ and that $p(B)e_n = 0$ if and only if $p = 0$. To see this, write $p(X + \lambda) = g(X) + p(\lambda)$ where $g$ is a polynomial of degree $k$ satisfying $g(0) = 0$. Writing $g(X) = \sum_{i=1}^k b_i X^i$, we have
$$ p(B) = p(N + \lambda I) = g(N) + p(\lambda)I = \sum_{i=1}^k b_i N^i + p(\lambda)I, \\ p(B)e_n = \sum_{i=1}^k b_i e_{n-i} + p(\lambda) e_n $$
which shows that $p(B)e_n \in \operatorname{span} \{ e_n, \dots, e_{n-k} \}$ and $p(B)e_n = 0$ if and only if $b_1 = \dots = b_k = 0$ and $p(\lambda) = 0$ which implies that $p = 0$.
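The claim can be illustrated numerically: for a single Jordan block $B$ (with the hypothetical values $n = 5$, $\lambda = 2$), the Krylov vectors $e_n, Be_n, \dots, B^{n-1}e_n$ are linearly independent, so no nonzero polynomial of degree less than $n$ can annihilate $e_n$.

```python
# Sketch: for a Jordan block B = lam*I + N (here n = 5, lam = 2), check
# that e_n, B e_n, ..., B^{n-1} e_n are linearly independent, so that
# p(B) e_n != 0 for every nonzero polynomial p of degree < n.
import numpy as np

n, lam = 5, 2.0
B = lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)

e_n = np.zeros(n)
e_n[-1] = 1.0

# Krylov matrix whose columns are B^i e_n for i = 0, ..., n-1
K = np.column_stack([np.linalg.matrix_power(B, i) @ e_n for i in range(n)])
assert np.linalg.matrix_rank(K) == n  # full rank: the vectors are independent
```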
Let us return to $A$. Note that $Ce_i = 0$ for $2 \leq i \leq n$. This implies by induction that $A^i e_n = B^i e_n$ for $0 \leq i < n$ as
$$ A^{i+1} e_n = A(A^i e_n) = A(B^i e_n) = (B + C)(B^i e_n) = B^{i+1} e_n + CB^i e_n $$
but since $B^i e_n \in \operatorname{span} \{ e_n, \dots, e_{n-i} \}$ and $i + 1 < n$ we have $CB^i e_n = 0$. Now, if $p(X) = \sum_{i=0}^k b_i X^i$ is a polynomial of degree $k < n$ that satisfies $p(A) = 0$, we have
$$ 0 = p(A)e_n = \sum_{i=0}^k b_i A^i e_n = \sum_{i=0}^k b_i B^i e_n = p(B) e_n $$
and by what we have shown before $p = 0$.
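The same Krylov-vector check applied to $A$ itself (with an arbitrary first column, here random entries and the hypothetical values $n = 6$, $\lambda = 3$) illustrates the conclusion that the minimal polynomial of $A$ has degree $n$:

```python
# Sketch: numeric sanity check that for a perturbed Jordan block A the
# vectors e_n, A e_n, ..., A^{n-1} e_n remain independent, so the
# minimal polynomial of A has degree n, whatever the first column is.
import numpy as np

rng = np.random.default_rng(0)
n, lam = 6, 3.0
A = lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)
A[:, 0] = rng.standard_normal(n)  # replace the first column arbitrarily

e_n = np.zeros(n)
e_n[-1] = 1.0

K = np.column_stack([np.linalg.matrix_power(A, i) @ e_n for i in range(n)])
assert np.linalg.matrix_rank(K) == n
```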
Alternatively, following the suggestion of darij grinberg and using the notation above, note that $A = \lambda I + (C + N)$ and $C + N$ is a matrix that is similar to a companion matrix:
$$ C + N = \begin{pmatrix} a_1 - \lambda & 1 & 0 & \dots & 0 \\ a_2 & 0 & 1 & \ddots & \vdots \\ \vdots & \ddots & \ddots & \ddots & 0 \\ a_{n-1} & \ddots & 0 & 0 & 1 \\ a_n & 0 & 0 & 0 & 0 \end{pmatrix} \sim \begin{pmatrix} 0 & 0 & \dots & 0 & a_n \\ 1 & 0 & \dots & 0 & a_{n-1} \\ 0 & 1 & \dots & 0 & a_{n-2} \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \dots & 1 & (a_1 - \lambda) \end{pmatrix} $$
Now, $p(A) = 0$ if and only if $g(N + C) = 0$, where $g(X) = p(X + \lambda)$. Since $N + C$ is similar to a companion matrix, the minimal polynomial of $N + C$ has degree $n$, and so does the minimal polynomial of $A$. This method has the advantage of giving the characteristic and minimal polynomial of $A$ explicitly; both are equal to
$$ (X - \lambda)^n - \left( (a_1 - \lambda)(X - \lambda)^{n-1} + a_2(X - \lambda)^{n-2} + \dots + a_{n-1}(X - \lambda) + a_n \right). $$
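This closed form can be checked symbolically, again for the illustrative choice $n = 4$, by comparing it against the characteristic polynomial computed directly from $A$:

```python
# Sketch: verify symbolically (n = 4) that the characteristic polynomial
# of A equals (X - lam)^n - ((a1 - lam)(X - lam)^{n-1} + a2 (X - lam)^{n-2}
# + ... + a_{n-1}(X - lam) + a_n).
import sympy as sp

n = 4
X, lam = sp.symbols('X lam')
a = sp.symbols('a1:%d' % (n + 1))  # a1, ..., a4

# A: Jordan block with eigenvalue lam whose first column is a1, ..., an
A = sp.zeros(n, n)
for i in range(n):
    A[i, 0] = a[i]
    if i > 0:
        A[i, i] = lam
    if i < n - 1:
        A[i, i + 1] = 1

charpoly = A.charpoly(X).as_expr()  # det(X*I - A)
formula = (X - lam)**n - ((a[0] - lam) * (X - lam)**(n - 1)
                          + sum(a[i] * (X - lam)**(n - 1 - i)
                                for i in range(1, n)))
assert sp.expand(charpoly - formula) == 0
```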