Image of a dense set through unbounded operator


Let $T$ be a densely defined, closed operator on a Hilbert space $H$ such that $T^*T$ is densely defined. Obviously, $\sigma(I+T^*T)\subset [1,\infty)$, which in particular implies that this operator is surjective and has a bounded inverse $A$.
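(A brief sketch of the bounded-inverse claim, included for completeness: for every $f\in\mathcal{D}(T^{\star}T)$,
$$ \big((I+T^{\star}T)f,f\big)=\|f\|^{2}+\|Tf\|^{2}\;\geq\;\|f\|^{2}, $$
so Cauchy–Schwarz gives $\|(I+T^{\star}T)f\|\geq\|f\|$. Hence $I+T^{\star}T$ is injective with closed range, and since it is self-adjoint its range is all of $H$; the inverse $A$ then satisfies $\|A\|\leq 1$.)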

Now, what I'm wondering is: If we let $\mathcal{D}$ be a dense/total subset of $H$ which sits inside the domain of $I+T^*T$, does it follow that $(I+T^*T)(\mathcal{D})$ remains a dense/total subset?

It seems very plausible, given how "close" this operator is to being bounded and invertible; however, I cannot find a proof.

A proof or counterexample would be most welcome.


On BEST ANSWER

Your conjecture is false.

By von Neumann's theorem, $T^{\star}T$ is always densely defined and self-adjoint whenever $T$ is a closed, densely defined linear operator on a Hilbert space $H$.

Let $H=L^2[0,1]$ and let $\mathcal{AC}[0,1]$ denote the absolutely continuous functions on $[0,1]$. Define $T=\frac{d}{dt}$ on the domain $$ \mathcal{D}(T)=\{ f \in \mathcal{AC}[0,1] : f'\in L^2[0,1],\;\; f(0)=0 \}. $$ Then $T^{\star}=-\frac{d}{dt}$ on the domain $$ \mathcal{D}(T^{\star})=\{ f \in \mathcal{AC}[0,1] : f' \in L^2[0,1],\;\; f(1)=0 \}. $$ This follows from the integration-by-parts identity $(f',g)+(f,g')=f\overline{g}\,|_{0}^{1}$.
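As a quick sanity check of the adjoint relation $(Tf,g)=(f,T^{\star}g)$, the following SymPy snippet evaluates both inner products for sample real-valued functions satisfying the boundary conditions; the particular choices of $f$ and $g$ below are illustrative assumptions, not part of the answer.

```python
import sympy as sp

t = sp.symbols('t')
# Illustrative choices (assumptions): f(0) = 0 puts f in D(T),
# and g(1) = 0 puts g in D(T*).
f = sp.sin(sp.pi * t / 2)   # f(0) = 0
g = (1 - t) * sp.exp(t)     # g(1) = 0

# (Tf, g) = ∫_0^1 f' g dt  versus  (f, T*g) = ∫_0^1 f (-g') dt
lhs = sp.integrate(sp.diff(f, t) * g, (t, 0, 1))
rhs = sp.integrate(f * (-sp.diff(g, t)), (t, 0, 1))

# The boundary term f(1)g(1) - f(0)g(0) vanishes, so the two agree.
print(abs(float(sp.simplify(lhs - rhs))) < 1e-10)  # True
```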

The operator $T^{\star}T$, which acts as $T^{\star}Tf=-f''$, has domain $\mathcal{D}(T^{\star}T)$ consisting of all $f \in \mathcal{AC}^2[0,1]$ (i.e., $f$ and $f'$ absolutely continuous with $f''\in L^2[0,1]$) for which $f(0)=f'(1)=0$.

Define a new operator $S$ as the restriction of $T^{\star}T$ to $$ \mathcal{D}(S) = \{ f\in\mathcal{D}(T^{\star}T) : f(0)=f'(0)=f(1)=f'(1)=0 \}. $$ Then $S$ is a closed operator, and $\mathcal{D}(S)$ is dense in $H$ because the subspace $\mathcal{C}_c^{\infty}(0,1)$ of compactly supported infinitely differentiable functions is dense in $H$ and contained in $\mathcal{D}(S)$. Because the endpoint conditions kill all boundary terms, integrating by parts twice gives $$ (Sf,e^{t})=(-f'',e^{t})=-(f,e^{t}),\;\;\; f\in\mathcal{D}(S), $$ and hence $$ ((I+T^{\star}T)f,e^{t})=0,\;\;\;f \in \mathcal{D}(S). $$ Therefore $(I+T^{\star}T)\mathcal{D}(S)$ is orthogonal to $e^{t}$ and cannot be dense in $H$, even though $\mathcal{D}(S)\subset \mathcal{D}(T^{\star}T)$ is dense in $H$.
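The orthogonality computation can also be checked concretely with SymPy. The sample function $f = t^2(1-t)^2$ below is an illustrative choice (not from the answer) that satisfies all four endpoint conditions, so it lies in $\mathcal{D}(S)$, and $\big((I+T^{\star}T)f,\,e^t\big)$ indeed comes out to zero.

```python
import sympy as sp

t = sp.symbols('t')
# Illustrative element of D(S): f = t^2 (1-t)^2 satisfies
# f(0) = f'(0) = f(1) = f'(1) = 0.
f = t**2 * (1 - t)**2
fp = sp.diff(f, t)
assert f.subs(t, 0) == 0 and f.subs(t, 1) == 0
assert fp.subs(t, 0) == 0 and fp.subs(t, 1) == 0

# (I + T*T) f = f - f''; its inner product with e^t should vanish.
inner = sp.integrate((f - sp.diff(f, t, 2)) * sp.exp(t), (t, 0, 1))
print(sp.simplify(inner))  # 0
```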