A question related to $S^{\perp}$ and closure of span of $S$


This question was asked in a previous year's linear algebra exam, and I was unable to solve it.

Let $V$ be an inner product space (the question actually says "integer", but I think it means inner) and let $S$ be a subset of $V$. Let $\bar S$ denote the closure of $S$ in $V$ with respect to the topology induced by the metric given by the inner product. Which of the following statements are true?

A. $S = (S^{\perp})^{\perp}$

B. $\overline S = (S^{\perp})^{\perp}$

C. $\overline{\text{span}(S)} = (S^{\perp})^{\perp}$

D. $S^{\perp} = ((S^{\perp})^{\perp})^{\perp}$

I was completely blank on how to approach this problem, although I have studied linear algebra carefully. Can you please suggest how I should approach it?

Edit: I tried it again. I marked A and D, but the answer is C and D. If A is false, I don't see why D must be true, so I think I am missing some concepts.


There are 3 answers below.

On BEST ANSWER

Recall that $S^\perp$ is defined as the set of all vectors in $V$ which are orthogonal to every vector in $S$.

In case $S$ is a singleton, say $S=\{s\}$, then we have $$ S^\perp = \{x\in V: \langle x, s\rangle =0\}, $$ so $S^\perp$ coincides with the null space of the continuous linear functional $$ x\in V \mapsto \langle x, s\rangle , $$ and for that reason $S^\perp$ is obviously a CLS (closed linear subspace).

For a general set $S$, one clearly has that $$ S^\perp = \bigcap_{s\in S} \{s\}^\perp, $$ so $S^\perp$ is the intersection of a family of CLS's, and hence itself a CLS.
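In a finite-dimensional space this is concrete: $S^\perp$ is the null space of the matrix whose rows are the vectors of $S$. A minimal numpy sketch (the helper name `perp` is illustrative, not a library function):

```python
import numpy as np

def perp(S, tol=1e-10):
    """Orthonormal basis (as columns) of the orthogonal complement
    of span(S): the null space of the matrix whose rows are the
    vectors of S, read off from the SVD."""
    A = np.atleast_2d(np.array(S, dtype=float))
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

# {(1, 0, 0)}^perp in R^3 is the y-z plane, a 2-dimensional CLS.
B = perp([[1.0, 0.0, 0.0]])
print(B.shape[1])  # dimension of the complement: 2
```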

Notice that the right-hand sides of (A), (B), (C) and (D) all refer to the "perp" of something, so they are all CLS's.

Since the left-hand sides of (A) and (B) may or may not be CLS's, these statements can't always be true.


Point (C) is true. To see why, first notice that $$ S\subseteq (S^\perp)^\perp \tag 1 $$ for a pretty elementary reason (which nevertheless sounds a bit like a tongue-twister): every vector in $S$ is orthogonal to anything that is orthogonal to every vector in $S$.

We then see that (1) states that $S$ is contained in a CLS, and since $\overline{\text{span}}(S)$ is the smallest CLS containing $S$, it follows that $$ \overline{\text{span}}(S)\subseteq (S^\perp)^\perp $$

To prove the converse inclusion, pick any vector $x$ in $(S^\perp)^\perp$. A well-known result about Hilbert spaces (which requires that $V$ be complete, and hence we need to assume it here) states that $x$ may be written as $$ x=u+v, $$ where $u$ is perpendicular to $\overline{\text{span}}(S)$, and $v$ belongs to $\overline{\text{span}}(S)$.

Notice that, in particular, $u$ is perpendicular to every vector of $S$, and hence $u\in S^\perp$.

On the other hand, since both $x$ and $v$ lie in $(S^\perp)^\perp$, we conclude that $u=x-v\in (S^\perp)^\perp$.

This implies that $u\in (S^\perp)^\perp \cap S^\perp$, so $u$ is perpendicular to itself, whence $u=0$ and then $$ x = u+v = v\in \overline{\text{span}}(S). $$ This concludes the proof of (C).
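The equality in (C) can be checked numerically in $\mathbb{R}^n$, where every subspace is closed and the closure in (C) is automatic. A sketch, assuming numpy and an illustrative SVD-based `perp` helper:

```python
import numpy as np

def perp(S, tol=1e-10):
    """Orthonormal basis (columns) of the orthogonal complement of span(S)."""
    A = np.atleast_2d(np.array(S, dtype=float))
    _, s, Vt = np.linalg.svd(A)
    return Vt[int(np.sum(s > tol)):].T

rng = np.random.default_rng(0)
S = rng.standard_normal((3, 6))   # 3 random vectors in R^6, as rows
PP = perp(perp(S).T)              # basis of (S^perp)^perp

# In R^6, span(S) is already closed, so (C) predicts (S^perp)^perp = span(S):
# equal dimensions, and every vector of S projects onto PP without residual.
assert PP.shape[1] == np.linalg.matrix_rank(S)
assert np.allclose(S.T - PP @ (PP.T @ S.T), 0)
print("(C) verified in R^6")
```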


Regarding point (D), it is true even if $V$ is not complete. It is a consequence of the following much more general result:

Lemma. Let $V$ be any set (e.g. the inner-product space of interest here) and let $\lozenge$ be a symmetric relation on $V$ (e.g. $x\mathrel{\lozenge} y \Leftrightarrow x\perp y$). For each subset $S\subseteq V$ define $$ S^\lozenge = \{x\in V: x\mathrel{\lozenge} s \text{ for all } s\in S\}. $$ Then $$ S^\lozenge = ((S^\lozenge)^\lozenge)^\lozenge $$ for any $S$.

Proof. The tongue-twister above immediately implies that $$ S\subseteq (S^\lozenge)^\lozenge. \tag 2 $$ Plugging in $S^\lozenge$ in place of $S$ in (2), we get $S^\lozenge \subseteq ((S^\lozenge)^\lozenge)^\lozenge$.

Next observe that $$ S_1\subseteq S_2 \Rightarrow S_2^\lozenge\subseteq S_1^\lozenge, $$ and if this is applied to (2), we get $ ((S^\lozenge)^\lozenge)^\lozenge\subseteq S^\lozenge. $ QED
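The Lemma is purely combinatorial, so it can be tested by brute force on a small finite set with a randomly chosen symmetric relation. A sketch (all names here are illustrative):

```python
import random
from itertools import combinations

def diamond(S, V, rel):
    """S^diamond: the elements of V related to every element of S."""
    return frozenset(x for x in V if all(rel(x, s) for s in S))

# A random symmetric relation on a small set, stored as unordered pairs,
# so rel(x, y) == rel(y, x) by construction.
random.seed(0)
V = frozenset(range(8))
pairs = {frozenset((x, y)) for x in V for y in V if random.random() < 0.4}
rel = lambda x, y: frozenset((x, y)) in pairs

# The identity S^d = ((S^d)^d)^d holds for every subset S.
for r in range(4):
    for S in combinations(V, r):
        d = diamond(S, V, rel)
        assert diamond(diamond(d, V, rel), V, rel) == d
print("triple-diamond identity verified")
```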

An interesting corollary, in a totally different area of mathematics, is:

Corollary. Given a ring $R$ and any subset $S\subseteq R$, define the commutant of $S$, denoted $S'$, to be the set formed by the elements of $R$ which commute with every element of $S$. Then $S'''=S'$.
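The corollary can be illustrated on a small example. For concreteness the sketch below uses centralizers in the symmetric group $S_3$ rather than a ring: "commutes with" is a symmetric relation there too, so the Lemma applies verbatim (function names are illustrative):

```python
from itertools import permutations

# Elements of S_3 as tuples, composed by (p o q)(i) = p[q[i]].
G = list(permutations(range(3)))

def compose(p, q):
    return tuple(p[i] for i in q)

def commutant(S):
    """The elements of G commuting with every element of S (the centralizer)."""
    return frozenset(g for g in G if all(compose(g, s) == compose(s, g) for s in S))

# "commutes with" is symmetric, so the Lemma gives S''' = S'.
for S in ([G[1]], [G[1], G[2]], G):
    c = commutant(S)
    assert commutant(commutant(c)) == c

# The commutant of all of G is the center of S_3, which is trivial.
assert commutant(G) == frozenset({(0, 1, 2)})
print("S''' = S' verified in S_3")
```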

On

The condition that $S$ is a subset (and is not, for example, given to be a subspace of $V$) is rather odd, and makes some of this problem different from standard linear algebra. In general, in any inner product space, $S^\perp$ is a closed subspace, so the right sides of each of A, B, C, and D are closed subspaces.

However, for each of $A$, $B$, and $D$ (the original version of $D$, and not the current one), it is possible for the left side not to be a closed subspace. (The example given by Peter Franek of when $S$ is a one-element set is likely the simplest.) In general, though, $\operatorname{span}(S)$ should be a subspace, and $\overline{\operatorname{span}(S)}$ should be a closed subspace, so it should seem plausible that C could be true. It in fact is, as it essentially only deals with subspaces --- see if you can prove this.

On

$A,B,C$ are false, in general, and $D$ is true. If $V$ is Hilbert, then also $C$ is true.

Let $S$ be any set; then $S^\perp$ is the set of all vectors that are perpendicular to all elements of $S$. That is usually the definition: $S^\perp := \{v\in V \mid g(v,s)=0 \text{ for all } s\in S\}$.

You can easily verify that this is a vector subspace. Namely, if $v,w\in S^\perp$ and $\alpha\in \mathbb{R}$, then also $v + \alpha w\in S^\perp$.

So also $(S^\perp)^\perp$ is a vector subspace.

This immediately excludes $A$ and $B$, in cases when $S$ (or $\overline{S}$) is just a subset of $V$ but not a vector subspace. You can find plenty of counter-examples, for instance $S=\{v\}$ for one vector $v\neq 0$.

$C$ is harder. The easy part is $LHS \subseteq RHS$.

Let $v$ be from the LHS. That is, $v$ is a limit of some sequence $v_i$ with $v_i\in \text{span}(S)$. Each $v_i$ is a finite linear combination $\sum_j \beta_j s_j$ of elements $s_j$ of $S$. If $w\in S^\perp$, then $w$ is orthogonal to all elements of $S$, hence $g(w, s_j)=0$ and also $g(w, v_i)=0$. Using continuity of $g$, we have $g(w, v) = g(w, \lim_i v_i) = \lim_i g(w, v_i) = 0$, so $w$ is also orthogonal to $v$. So $v$ is in $(S^\perp)^\perp$.

For the other implication, you need to show that a closed subspace (closure of the span of $S$ here) has an orthogonal complement. This is not always the case: there are proper closed subspaces $Y$ such that $Y^\perp = 0$!

So take such a $Y$ as your $S$: then $\text{span}(S) = S$ and $\overline{S} = S$, while $(S^\perp)^\perp = \{0\}^\perp = V \neq S$.

However, this being said, $C$ holds under some additional assumptions, such as $V$ being a Hilbert space.

If $C$ is true, then $D$ follows easily, because it reduces to $$ S^\perp = (\overline{\text{span}(S)})^\perp, $$ which is obvious.
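The triple-perp identity of (D) is also easy to check numerically in $\mathbb{R}^n$. A sketch, again with an illustrative SVD-based `perp` helper and numpy assumed:

```python
import numpy as np

def perp(S, tol=1e-10):
    """Orthonormal basis (columns) of the orthogonal complement of span(S)."""
    A = np.atleast_2d(np.array(S, dtype=float))
    _, s, Vt = np.linalg.svd(A)
    return Vt[int(np.sum(s > tol)):].T

rng = np.random.default_rng(1)
S = rng.standard_normal((2, 5))   # 2 random vectors in R^5, as rows

P1 = perp(S)                      # S^perp
P3 = perp(perp(P1.T).T)           # ((S^perp)^perp)^perp

# (D): the triple perp collapses back to S^perp -- same dimension,
# and each basis vector of P1 projects onto span(P3) without residual.
assert P1.shape == P3.shape == (5, 3)
assert np.allclose(P1 - P3 @ (P3.T @ P1), 0)
print("(D) verified in R^5")
```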

In general, if $V$ is not complete, then the proof of $D$ is given by the answer of @Ruy.