The definition of a unitary operator $T$ on an inner product space $V$ is $\langle Tv, Tw\rangle = \langle v, w\rangle$ for every $v, w \in V$.
Can you please explain the intuition behind this, and what the formal definition actually means?
Why does a unitary operator preserve an orthonormal basis?
Can we infer from an operator being unitary that it has eigenvectors?
And why, when $T^* = T^{-1}$, does $T$ preserve the inner product, and therefore preserve orthonormal bases, lengths, and distances?
Thank you.
unitary operator explanation
791 Views Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-25
First of all, recall that the inner product gives us a generalization of the "dot-product" in $\Bbb R^n$. Just as with the dot-product, we can interpret the inner product as giving us information about the "angle" between two vectors. In particular, because of the Cauchy-Schwarz inequality $|\langle u,v \rangle| \leq \|u\| \cdot \|v\|$, we can always find an angle $\theta$ for which $$ \langle u,v \rangle = \cos \theta \cdot \|u\|\,\|v\| \implies \theta = \cos^{-1} \left( \frac{\langle u,v \rangle}{\|u\| \, \|v\|}\right), $$ and we can think of this $\theta$ as the angle between the vectors $u$ and $v$. In particular: if $u,v$ are unit vectors (if $\|u\| = \|v\| = 1$), then $\langle u, v \rangle = \cos \theta$.
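The angle formula above is easy to try out numerically. A minimal sketch in $\Bbb R^2$, in pure Python (the helpers `inner` and `norm` are just illustrative names, not library functions):

```python
import math

def inner(u, v):
    # real dot product on R^n
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(inner(u, u))

u, v = [1.0, 0.0], [1.0, 1.0]
theta = math.acos(inner(u, v) / (norm(u) * norm(v)))
print(round(math.degrees(theta)))  # 45
```

As expected, the angle between $(1,0)$ and $(1,1)$ comes out as $45^\circ$.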
Note that $T$ is unitary if and only if for every $u,v \in V$, we have $\langle Tu,Tv \rangle = \langle u,v \rangle$. We make the following observations:

1. $T$ preserves lengths: taking $u = v$ gives $\|Tu\|^2 = \langle Tu,Tu \rangle = \langle u,u \rangle = \|u\|^2$, so $\|Tu\| = \|u\|$.
2. $T$ preserves angles: since $T$ preserves both inner products and norms, the formula above yields the same angle $\theta$ for $Tu,Tv$ as for $u,v$.
With this established, it's clear why $T$ should preserve orthonormal bases: if each vector $u_1,\dots,u_n$ has length $1$, then the same applies for $Tu_1,\dots,Tu_n$. Similarly, if the angle between $u_1,u_2$ is $90^\circ$, then the same is true for $Tu_1,Tu_2$. Putting that together: if $u_1,\dots,u_n$ are pairwise-orthogonal unit vectors, then the same is true for $Tu_1,\dots,Tu_n$.
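A concrete check of this basis-preservation claim, using a rotation of $\Bbb R^2$ (a simple example of a unitary map) applied to the standard orthonormal basis; the helpers are illustrative, not library functions:

```python
import math

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def apply(M, x):
    # matrix-vector product
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

c, s = math.cos(math.pi / 6), math.sin(math.pi / 6)
T = [[c, -s], [s, c]]             # rotation by 30 degrees: a unitary (orthogonal) map
e1, e2 = [1.0, 0.0], [0.0, 1.0]   # the standard orthonormal basis of R^2
Te1, Te2 = apply(T, e1), apply(T, e2)

assert abs(inner(Te1, Te1) - 1) < 1e-12  # lengths still 1
assert abs(inner(Te2, Te2) - 1) < 1e-12
assert abs(inner(Te1, Te2)) < 1e-12      # still orthogonal
```

The images $Te_1, Te_2$ again form an orthonormal set, exactly as the argument predicts.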
Geometrically, it turns out that every unitary transformation consists of a combination of "reflections" and "rotations". You should verify that rotations and reflections indeed satisfy the two properties listed above, i.e. they preserve both length and the angle between vectors.
Regarding eigenvalues: note that if $x \neq 0$ is an eigenvector of $T$ with $Tx = \lambda x$, then it would necessarily follow that $$ \|x\|^2 = \langle x,x \rangle = \langle Tx,Tx \rangle = \langle \lambda x, \lambda x \rangle = \lambda \bar \lambda \langle x, x \rangle = |\lambda|^2 \|x\|^2. $$ In other words, the eigenvalue $\lambda$ must satisfy $|\lambda| = 1$. If $\lambda$ is real, then either we have $\lambda = 1$ (so that $x$ is a fixed point of $T$) or $\lambda = -1$ (so that $T$ reflects $x$ across the origin). Complex eigenvalues of magnitude $1$ correspond to "rotation" in a sense.
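We can verify the $|\lambda| = 1$ conclusion on a concrete example: viewed over $\Bbb C^2$, the rotation matrix by $\pi/4$ has eigenvalue $e^{i\pi/4} = \cos\frac{\pi}{4} + i\sin\frac{\pi}{4}$ with eigenvector $(1, -i)$ (both chosen here for illustration):

```python
import math

c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
T = [[c, -s], [s, c]]        # rotation by pi/4, viewed as a unitary map on C^2
lam = complex(c, s)          # eigenvalue e^{i*pi/4}
x = [1, -1j]                 # a corresponding eigenvector

Tx = [sum(T[i][j] * x[j] for j in range(2)) for i in range(2)]
assert all(abs(Tx[i] - lam * x[i]) < 1e-12 for i in range(2))  # Tx = lambda * x
assert abs(abs(lam) - 1) < 1e-12                               # |lambda| = 1
```

Note that the eigenvectors only appear once we allow complex scalars: over $\Bbb R^2$ this rotation has no eigenvectors at all.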
For the finite-dimensional complex case, the spectral theorem tells us that every *normal* transformation can be unitarily diagonalized, and a unitary $T$ is normal (since $T T^* = T^* T = I$). So there is an orthonormal basis of eigenvectors of $T$: yes, a unitary operator on a finite-dimensional complex space always has eigenvectors, and we can describe the transformation completely in terms of them.
Regarding the fact that $T^* = T^{-1}$: it is difficult to get a general, geometric, intuitive sense of the relationship between $T^*$ and $T$. I think the easiest way to think about this fact is via the proof. Note that by definition, we have $\langle x,Ty \rangle = \langle T^*x, y \rangle$. It follows that for all vectors $u,v$, $$ \langle u, v \rangle = \langle Tu,Tv \rangle = \langle T^*Tu, v \rangle. $$ In other words, $\langle u, v \rangle$ is always the same as $\langle T^*Tu, v \rangle$, so $\langle T^*Tu - u, v \rangle = 0$ for every $v$; taking $v = T^*Tu - u$ and using the definiteness of the inner product forces $T^*Tu = u$. That is, $T^*T = I$.
If $V$ is finite dimensional, this is enough to conclude that $T^* = T^{-1}.$
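The identity $T^*T = I$ can also be checked on a genuinely complex example. Here $T^*$ is the conjugate transpose, and the particular matrix below (with orthonormal columns in $\Bbb C^2$) is chosen only for illustration:

```python
import math

s = 1 / math.sqrt(2)
T = [[s, s], [s * 1j, -s * 1j]]   # a unitary matrix on C^2: orthonormal columns

def adjoint(M):
    # conjugate transpose T*
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = matmul(adjoint(T), T)          # should equal the identity: T*T = I
I = [[1, 0], [0, 1]]
assert all(abs(P[i][j] - I[i][j]) < 1e-12 for i in range(2) for j in range(2))
```

The product $T^*T$ comes out as the identity matrix (up to floating-point error), confirming $T^* = T^{-1}$ for this $T$.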