I have a question about Hilbert spaces: if I have an orthonormal basis, I shouldn't be able to find any other vector in the Hilbert space orthogonal to all of them. But the zero element is orthogonal to every element of the space, and it obviously belongs to the space. How is this possible?
2026-02-22
Is the zero element of a Hilbert space orthogonal to every vector?
1.1k views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 answer below.
This is a question of unwinding the definitions, and also a matter of stating a theorem precisely.
Definition of orthogonal: vectors $\mathbf{v}$ and $\mathbf{w}$ are orthogonal iff $\langle \mathbf{v}, \mathbf{w}\rangle=0$. This is clearly true if $\mathbf{v}=0$ and $\mathbf{w}$ is any vector whatsoever, so the zero vector is orthogonal to all vectors.
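To spell out why the zero vector qualifies, one line of linearity in the first slot suffices (writing $\mathbf{0}=0\cdot\mathbf{v}$ for any vector $\mathbf{v}$):

```latex
\langle \mathbf{0}, \mathbf{w}\rangle
  = \langle 0\cdot\mathbf{v},\, \mathbf{w}\rangle
  = 0\,\langle \mathbf{v}, \mathbf{w}\rangle
  = 0 .
```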
Definition of a span $[\mathbf{v}_1,\ldots,\mathbf{v}_n]$: the set of all vectors that can be expressed in the form $a_1\mathbf{v}_1+\cdots+a_n\mathbf{v}_n$, for scalars $a_1,\ldots,a_n$. (For simplicity let's deal only with the finite-dimensional case.) Clearly the zero vector is in the span, just by taking all the $a_i$'s equal to $0$.
Now the theorem you seem to have implicitly in mind is this: if $\mathbf{w}$ is orthogonal to all the vectors $\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}$, and $\mathbf{w}$ is in the span $[\mathbf{v}_1,\ldots,\mathbf{v}_n]$, then $\mathbf{w}=\mathbf{0}$. (True provided the inner product is positive definite, which is part of the definition for Hilbert spaces.) This doesn't contradict either definition. Corollary: any vector orthogonal to all the vectors of a basis is zero.
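A quick sketch of why the theorem holds (with the convention that the inner product is linear in its first slot): write $\mathbf{w}$ in the span and expand $\|\mathbf{w}\|^2$. Each $\langle\mathbf{v}_i,\mathbf{w}\rangle$ vanishes, since orthogonality to $\mathbf{w}$ is symmetric.

```latex
\mathbf{w}=\sum_{i=1}^{n} a_i\mathbf{v}_i
\quad\Longrightarrow\quad
\|\mathbf{w}\|^{2}
  = \Big\langle \sum_{i=1}^{n} a_i\mathbf{v}_i,\; \mathbf{w}\Big\rangle
  = \sum_{i=1}^{n} a_i\,\langle \mathbf{v}_i,\mathbf{w}\rangle
  = 0,
```

and positive definiteness then forces $\mathbf{w}=\mathbf{0}$.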
I suppose someone might carelessly state this result as, "No vector can be orthogonal to all vectors of a basis", rather than "Only the zero vector can be orthogonal to all the vectors of a basis". The careless version is false.
Another example, suggested by one of your comments: $W\cap W^\perp=\{\mathbf{0}\}$. Someone might carelessly say, "A subspace and its orthogonal complement are disjoint", rather than "The intersection of a subspace and its orthogonal complement consists solely of the zero vector".
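This, too, is the same computation in miniature: any $\mathbf{w}$ lying in both $W$ and $W^\perp$ is orthogonal to itself, so

```latex
\mathbf{w}\in W\cap W^{\perp}
\;\Longrightarrow\;
\|\mathbf{w}\|^{2}=\langle \mathbf{w},\mathbf{w}\rangle=0
\;\Longrightarrow\;
\mathbf{w}=\mathbf{0}.
```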
Let me use this question as an excuse to ramble on about mathematical exposition in general. It's a special case of the expert problem: if you thoroughly understand a subject, it's hard to put yourself in the place of a novice. Imprecise, technically incorrect statements whose overall sense is true flow easily from the lips or the fingers, especially if the correct version is more awkward or verbose. I think learning to navigate around these "abuses of language" is part of what people mean by "mathematical maturity". A really great expositor knows how to avoid strewing these pebbles (or boulders) in the path of the reader, without sacrificing an easy conversational style.
Another commenter suggested defining "orthogonal" as a relation between nonzero vectors, so saying $\mathbf{v}$ and $\mathbf{w}$ are orthogonal would automatically imply that $\mathbf{v}$ and $\mathbf{w}$ are both nonzero. I don't like this, for three reasons. First, it disagrees with standard terminology. Second, getting used to the special role of the zero vector is part of getting past the novice stage. Third, the standard definition is logically "cleaner". For example, $W^\perp$ is defined as the set of all vectors that are orthogonal to all vectors in $W$. But with this modified definition, you have to add, "plus the zero vector". (We want $W^\perp$ to be a subspace.)
Or, looked at categorically: the zero space is an initial object in the category of vector spaces, which makes it analogous, in some sense, to the empty set (the initial object in Set). In Set, "complement" implies $A\cap B=\varnothing$; so in Vect it should imply $A\cap B=\{\mathbf{0}\}$.