Does the order matter when writing a conversion matrix?

Asked on 2026-03-30 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 901 views

I have two eigenvectors $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 2 \end{pmatrix}$. Will the eigenbasis be $\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$ or $\begin{pmatrix} 0 & 1 \\ 2 & 0 \end{pmatrix}$?

There are 2 answers below.
Answer by Bumbble Comm:
The order of the eigenvectors doesn’t matter per se. However, there might be other constraints in play. For instance, it’s common for the eigenvalues in the diagonal matrix to be arranged in either descending or ascending order. In that case, the eigenvectors must be arranged to match.
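To make the "arranged to match" point concrete, here is a minimal NumPy sketch (the matrix `A` below is a made-up example, not taken from the question): `np.linalg.eig` returns the eigenvalues in no guaranteed order, and column $i$ of the eigenvector matrix belongs to eigenvalue $i$, so if a convention requires sorted eigenvalues, the eigenvector columns must be permuted with the same index order.

```python
import numpy as np

# A made-up diagonalizable matrix, purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Column i of `vecs` is the eigenvector for vals[i].
vals, vecs = np.linalg.eig(A)

# Sort eigenvalues in ascending order and permute the
# eigenvector columns with the same indices.
order = np.argsort(vals)
vals_sorted = vals[order]
vecs_sorted = vecs[:, order]

# Each column still satisfies A v = lambda v after reordering.
for lam, v in zip(vals_sorted, vecs_sorted.T):
    assert np.allclose(A @ v, lam * v)
```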
Suppose $A$ is the original matrix whose eigenvectors you computed. Suppose that $\xi_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector corresponding to the eigenvalue $\lambda_1$, while $\xi_2 = \begin{pmatrix} 0 \\ 2 \end{pmatrix}$ corresponds to the eigenvalue $\lambda_2$. Define the following matrices, whose columns are the eigenvectors in the two possible orders: \begin{equation} P = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} \quad \text{and} \quad Q = \begin{pmatrix} 0 & 1 \\ 2 & 0 \end{pmatrix}. \end{equation}
The difference between $P$ and $Q$ is the effect they have on the diagonal representations of $A$: \begin{equation} P^{-1} A P = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix} \quad \text{whereas} \quad Q^{-1}AQ = \begin{pmatrix} \lambda_2 & 0 \\ 0 & \lambda_1 \end{pmatrix}. \end{equation}
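To verify the swap numerically, here is a short NumPy sketch; the eigenvalues $\lambda_1 = 3$ and $\lambda_2 = 5$ are made-up values for illustration. (Note that with these particular eigenvectors, which lie along the coordinate axes, $A$ is necessarily diagonal.)

```python
import numpy as np

# Hypothetical eigenvalues, chosen only for illustration.
lam1, lam2 = 3.0, 5.0

P = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # columns: xi_1, xi_2
Q = np.array([[0.0, 1.0],
              [2.0, 0.0]])   # columns: xi_2, xi_1

# A xi_1 = lam1 xi_1 and A xi_2 = lam2 xi_2 force A to be diagonal here.
A = np.diag([lam1, lam2])

print(np.linalg.inv(P) @ A @ P)  # diag(lam1, lam2) = diag(3, 5)
print(np.linalg.inv(Q) @ A @ Q)  # diag(lam2, lam1) = diag(5, 3)
```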
So, the answer to your question is: it depends on what you mean by "matter". If you mean "does it make a difference", then the answer is yes, because as shown above, the eigenvalues are swapped in the diagonal representation of $A$. But if you mean "is there any inherent reason to choose one over the other?", the answer is no; you just need to keep track of where the eigenvalues go.
Added Remark:
Just a comment about your terminology: you said "will the eigenbasis be ...", but then you proceeded to list the matrices $P$ and $Q$ defined above. Strictly speaking, $P$ and $Q$ are called change-of-basis (or change-of-coordinates) matrices, whereas the eigenbasis is the basis of eigenvectors itself: $\{\xi_1, \xi_2\}$.
If you want to be more precise, you could speak of the ordered eigenbases $(\xi_1, \xi_2)$ and $(\xi_2, \xi_1)$, using tuples rather than sets to emphasise that the order is being tracked; as sets, $\{\xi_1, \xi_2\}$ and $\{\xi_2, \xi_1\}$ are identical.
I bring this up only for completeness; if you were simply being a little imprecise and already know the distinction, then of course ignore this remark.