Checking whether an operator is self-adjoint. Problem with domain of an operator.

I want to check whether the position operator $A$, where $Af(x)=xf(x)$, is self-adjoint. For this to be true it has to be Hermitian, and the domains of $A$ and its adjoint must coincide. The Hilbert space I'm working with is of course $L^2(\mathbb{R})$ with the natural inner product. The problem I'm having is with checking the domains: the definition of the adjoint's domain is extremely unwieldy. Here are the definitions I'm using.

The domain of the operator is taken to be the maximal one: $$D(A) =\{ f \in H : Af \in H\}.$$

The domain of its adjoint (and subsequently the adjoint itself) is defined through: $$D(A^*) = \{ f \in H : \exists f_1 : \forall g \in H, \ (f,Ag)=(f_1,g) \}.$$ The adjoint is then given by $A^*f = f_1$.

I'm not very comfortable with this. Verifying that my operator is Hermitian boils down to writing down an integral, but now I have no idea how to go about comparing the two domains to establish that $D(A)=D(A^*)$.

I've found the domain of $A$ to be the set of all $f$ for which $\int_{\mathbb{R}}dx \ x^2\lvert f(x)\rvert^2$ is finite. But writing down the definition for the domain of the adjoint gives me:

$$D(A^*)=\{ f \in L^2(\mathbb{R}) : \exists f_1 :\forall g : \int_{\mathbb{R}}dx \ x f^*(x)g(x) = \int_{\mathbb{R}}dx \ f_1^*(x)g(x) \ \}.$$

(here $f_1$ and $g$ also belong to $L^2(\mathbb{R})$).

I'm supposed to conclude that the two sets are equal but I don't know how. Any help would be greatly appreciated.
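To see that the domain condition is a genuine restriction, here is a small numerical sketch. The test function $f(x) = 1/(1+\lvert x\rvert)$ is my own choice, not from the question: it lies in $L^2(\mathbb{R})$, yet $\int_{\mathbb{R}} x^2\lvert f(x)\rvert^2\,dx$ diverges, so $f \notin D(A)$.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative example (my own choice, not from the question):
# f(x) = 1/(1+|x|) lies in L^2(R) but not in D(A).
f = lambda x: 1.0 / (1.0 + abs(x))

# ||f||^2 = 2 * int_0^inf (1+x)^(-2) dx = 2, so f is in L^2(R)
l2_norm_sq = 2.0 * quad(lambda x: f(x) ** 2, 0.0, np.inf)[0]

# int_{-R}^{R} x^2 |f(x)|^2 dx grows like 2R (the integrand tends
# to 1 as |x| -> infinity), so f is NOT in D(A)
tail = [2.0 * quad(lambda x: (x * f(x)) ** 2, 0.0, R)[0] for R in (10, 100, 1000)]
print(l2_norm_sq, tail)
```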

There are 2 answers below.

Accepted answer:

Since $A$ is densely defined, we can characterise $D(A^\ast)$ as

$$\begin{align} D(A^\ast) &= \left\lbrace f \in H : g \mapsto (f, Ag) \text{ is continuous}\right\rbrace\\ &= \left\lbrace f\in H : \bigl(\exists K_f < \infty\bigr)\bigl(\forall g\in D(A)\bigr)\bigl(\lvert (f,Ag)\rvert \leqslant K_f\cdot \lVert g\rVert\bigr)\right\rbrace. \end{align}$$

We know that $D(A) \subset D(A^\ast)$, and need only see the reverse inclusion. Thus let $f \in D(A^\ast)$. For $n \in \mathbb{Z}^+$, let

$$g_n(x) = \begin{cases} x\cdot f(x) &, \lvert x\rvert \leqslant n\\ \quad 0 &, \lvert x\rvert > n.\end{cases}$$

Then $g_n \in D(A)$, and

$$\begin{align} (f,Ag_n) &= \int_{\mathbb{R}} f(x) \cdot\overline{x\cdot g_n(x)}\,dx\\ &= \int_{-n}^n f(x) x^2\overline{f(x)}\,dx\\ &= \int_{-n}^n x^2\lvert f(x)\rvert^2\,dx\\ &= \int_{\mathbb{R}} \lvert g_n(x)\rvert^2\,dx\\ &= \lVert g_n\rVert^2\\ &\leqslant K_f\cdot \lVert g_n\rVert. \end{align}$$

So we must have $\lVert g_n\rVert \leqslant K_f$ for all $n$; letting $n \to \infty$, monotone convergence gives

$$\int_{\mathbb{R}} \lvert x\cdot f(x)\rvert^2\,dx \leqslant K_f^2 < \infty,$$

and that means $f \in D(A)$.
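The truncation argument can be seen numerically; here is a sketch with the concrete choice $f(x)=e^{-x^2}$ (my own choice, not from the answer), for which $\lVert g_n\rVert^2$ increases to $\int_{\mathbb{R}} x^2 e^{-2x^2}\,dx = \tfrac14\sqrt{\pi/2}$.

```python
import numpy as np
from scipy.integrate import quad

# Concrete f in D(A) (f(x) = exp(-x^2) is my own choice, not the answer's).
f = lambda x: np.exp(-x ** 2)

# ||g_n||^2 = int_{-n}^{n} x^2 |f(x)|^2 dx, nondecreasing in n
norms_sq = [quad(lambda x: x ** 2 * f(x) ** 2, -n, n)[0] for n in (1, 2, 3, 5)]

# Monotone convergence: the limit is int_R x^2 e^{-2x^2} dx = (1/4) sqrt(pi/2)
limit = 0.25 * np.sqrt(np.pi / 2.0)
print(norms_sq, limit)
```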

Second answer:

It's easy to see that $A$ is symmetric on its dense domain, which means that $A^{\star}$ extends $A$. To show that $A$ is self-adjoint, it is then enough to show that $(A\pm iI)$ are surjective, which is trivial in this case: if $f \in L^{2}$, then $(x\pm i)^{-1}f \in L^{2}$ (since $\lvert x\pm i\rvert \geq 1$), and its image under $(A\pm iI)$ is $f$.
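A quick numerical sanity check of that surjectivity claim, on a grid with an arbitrary test function (both my own choices, not from the answer): $(x\pm i)^{-1}f$ is pointwise no larger than $f$, since $\lvert x\pm i\rvert \geq 1$, and multiplying back by $(x\pm i)$ recovers $f$.

```python
import numpy as np

# Grid and test function are my own choices, not from the answer.
x = np.linspace(-50.0, 50.0, 20001)
f = np.exp(-x ** 2) * (1.0 + 1j * x)   # some function in L^2

for s in (+1.0, -1.0):                 # the two operators (A ± iI)
    g = f / (x + s * 1j)               # candidate preimage (x ± i)^{-1} f
    # |x ± i| >= 1, so |g| <= |f| pointwise: g lies in L^2 as well
    assert np.all(np.abs(g) <= np.abs(f) + 1e-12)
    # (A ± iI)g = (x ± i) g recovers f
    assert np.allclose((x + s * 1j) * g, f)
print("resolvent identities check out")
```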

So, suppose $A : \mathcal{D}(A)\subseteq X\rightarrow X$ is densely defined and symmetric on the linear domain $\mathcal{D}(A)$, and that $(A\pm iI)$ are surjective. To show that $A$ is self-adjoint, suppose that $y \in \mathcal{D}(A^{\star})$. Then, because $A-iI$ is surjective, there exists $z \in \mathcal{D}(A)$ such that $$ (A^{\star}-iI)y=(A-iI)z. $$ Consequently, for all $x \in \mathcal{D}(A)$, $$ ((A+iI)x,y)=(x,(A^{\star}-iI)y)=(x,(A-iI)z)=((A+iI)x,z). $$ Because $(A+iI)$ is also surjective, $y-z$ is orthogonal to all of $X$, so $y=z$. Hence every $y \in \mathcal{D}(A^{\star})$ lies in $\mathcal{D}(A)$, with $A^{\star}y=Ay$; so $A=A^{\star}$.