Zero element in a Hilbert space is orthogonal?


I have a doubt about Hilbert spaces. If I have an orthonormal basis, I shouldn't be able to find any other vector orthogonal to all of the basis vectors. But the zero element is orthogonal to every element of the space, and it obviously belongs to the space. How is this possible?


On BEST ANSWER

This is a question of unwinding the definitions, and also a matter of stating a theorem precisely.

Definition of orthogonal: vectors $\mathbf{v}$ and $\mathbf{w}$ are orthogonal iff $\langle \mathbf{v}, \mathbf{w}\rangle=0$. This is clearly true if $\mathbf{v}=0$ and $\mathbf{w}$ is any vector whatsoever, so the zero vector is orthogonal to all vectors.
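To spell this out, linearity of the inner product in the first argument gives a one-line verification, writing $\mathbf{0}=0\cdot\mathbf{v}$ for any vector $\mathbf{v}$:

```latex
\langle \mathbf{0}, \mathbf{w}\rangle
  = \langle 0\cdot\mathbf{v}, \mathbf{w}\rangle
  = 0\cdot\langle \mathbf{v}, \mathbf{w}\rangle
  = 0.
```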

Definition of a span $[\mathbf{v}_1,\ldots,\mathbf{v}_n]$: the set of all vectors that can be expressed in the form $a_1\mathbf{v}_1+\cdots+a_n\mathbf{v}_n$, for scalars $a_1,\ldots,a_n$. (For simplicity let's deal only with the finite dimensional case.) Clearly the zero vector is in the span, just by taking all the $a_i$'s equal to 0.

Now the theorem you seem to have implicitly in mind is this: if $\mathbf{w}$ is orthogonal to all the vectors $\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}$, and $\mathbf{w}$ is in the span $[\mathbf{v}_1,\ldots,\mathbf{v}_n]$, then $\mathbf{w}=\mathbf{0}$. (True provided the inner product is positive definite, which is part of the definition for Hilbert spaces.) This doesn't contradict either definition. Corollary: any vector orthogonal to all the vectors of a basis is zero.
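The proof of this theorem is short enough to sketch. Write $\mathbf{w}=a_1\mathbf{v}_1+\cdots+a_n\mathbf{v}_n$ (possible since $\mathbf{w}$ is in the span) and expand using linearity and the hypothesis that $\mathbf{w}$ is orthogonal to each $\mathbf{v}_i$:

```latex
\langle \mathbf{w}, \mathbf{w}\rangle
  = \Big\langle \sum_{i=1}^n a_i \mathbf{v}_i,\; \mathbf{w}\Big\rangle
  = \sum_{i=1}^n a_i \langle \mathbf{v}_i, \mathbf{w}\rangle
  = 0,
```

and positive definiteness then forces $\mathbf{w}=\mathbf{0}$.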

I suppose someone might carelessly state this result as, "No vector can be orthogonal to all vectors of a basis", rather than "Only the zero vector can be orthogonal to all the vectors of a basis". The careless version is false.

Another example, suggested by one of your comments: $W\cap W^\perp=\{\mathbf{0}\}$. Someone might carelessly say, "A subspace and its orthogonal complement are disjoint", rather than "The intersection of a subspace and its orthogonal complement consists solely of the zero vector".
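A concrete instance (my illustration, in $\mathbb{R}^2$ with the usual dot product): take $W$ to be the $x$-axis. Then

```latex
W = \{(a, 0) : a \in \mathbb{R}\}, \qquad
W^\perp = \{(0, b) : b \in \mathbb{R}\}, \qquad
W \cap W^\perp = \{(0, 0)\},
```

so the two subspaces meet in exactly one point, the origin, and hence are not disjoint.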

Let me use this question as an excuse to ramble on about mathematical exposition in general. It's a special case of the expert problem: if you thoroughly understand a subject, it's hard to put yourself in the place of a novice. Imprecise, technically incorrect statements whose overall sense is true flow easily from the lips or the fingers, especially if the correct version is more awkward or verbose. I think learning to navigate around these "abuses of language" is part of what people mean by "mathematical maturity". A really great expositor knows how to avoid strewing these pebbles (or boulders) in the path of the reader, without sacrificing an easy conversational style.

Another commenter suggested defining "orthogonal" as a relation between nonzero vectors, so saying $\mathbf{v}$ and $\mathbf{w}$ are orthogonal would automatically imply that $\mathbf{v}$ and $\mathbf{w}$ are both nonzero. I don't like this, for three reasons. First, it disagrees with standard terminology. Second, getting used to the special role of the zero vector is part of getting past the novice stage. Third, the standard definition is logically "cleaner". For example, $W^\perp$ is defined as the set of all vectors that are orthogonal to all vectors in $W$. But with this modified definition, you have to add, "plus the zero vector". (We want $W^\perp$ to be a subspace.)
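For comparison, here is the standard definition in symbols, for a subspace $W$ of a Hilbert space $H$; note that it needs no special case for $\mathbf{0}$:

```latex
W^\perp = \{\mathbf{v} \in H : \langle \mathbf{v}, \mathbf{w}\rangle = 0
  \text{ for all } \mathbf{w} \in W\}.
```

Under the nonzero-vectors-only convention, the set on the right would exclude $\mathbf{0}$, and the definition would need an extra clause to make $W^\perp$ a subspace.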

Or looked at categorically: the zero space is an initial object in the category of vector spaces, which makes it analogous, in some sense, to the empty set (the initial object for Set). In Set, "complement" implies $A\cap B=\varnothing$; so in Vect it should imply $A\cap B=\{\mathbf{0}\}$.