I need to show that the embedding of $H^1(\mathbb{R}^N)$ in $L^2(\mathbb{R}^N)$ is not compact.
I need to find a sequence $(u_n)\subset H^1(\mathbb{R}^N)$ bounded such that there is no subsequence that converges in $L^2(\mathbb{R}^N)$.
I took $\varphi\in C^\infty_c(\mathbb{R}^N)$, $\varphi\not\equiv 0$, with $S=\operatorname{supp}(\varphi)\subset B_1(0)$, and defined $u_n(x)=\varphi(x+x_n)$, where $(x_n)$ is a sequence in $\mathbb{R}^N$ with $|x_n|\rightarrow\infty$.
First, one needs to show that $(u_n)$ is bounded in $H^1(\mathbb{R}^N)$. Indeed, by the substitution $y=x+x_n$ (the Lebesgue measure is translation invariant), \begin{eqnarray} \|u_n\|_{H^1}^2&=&\int_{\mathbb{R}^N}|\varphi(x+x_n)|^2+\sum_{i=1}^N|\varphi_{x_i}(x+x_n)|^2\,dx\\ &=&\int_{\mathbb{R}^N}|\varphi(y)|^2+\sum_{i=1}^N|\varphi_{x_i}(y)|^2\,dy\\ &=&\|\varphi\|_{H^1}^2. \end{eqnarray} So $(u_n)$ is bounded.
First question: Is this correct? Is there any detail missing?
Second question: To prove that there is no convergent subsequence, I can show that there is no Cauchy subsequence. I saw in a book that I just need to use the support of $u_n$ and $u_m$. How to proceed?
Your boundedness argument in $H^1$ is correct.
If the support of $u_n$ does not intersect the support of $u_m$, then $u_n$ and $u_m$ are orthogonal in $L^2$, since the cross term $\int_{\mathbb{R}^N} u_n u_m \, \mathrm{d}x$ vanishes. In fact, you have \begin{align*}\|u_n - u_m\|_{L^2}^2 &= \int_{\mathbb{R}^N} |u_n(x) - u_m(x)|^2 \, \mathrm{d}x = \int_{\mathbb{R}^N} |u_n(x)|^2 + |u_m(x)|^2 \, \mathrm{d}x \\&= \|u_n\|_{L^2}^2 + \|u_m\|_{L^2}^2 = 2\|\varphi\|_{L^2}^2 > 0.\end{align*} Since this is a fixed positive constant independent of $n$ and $m$, no subsequence of $(u_n)$ can be Cauchy in $L^2(\mathbb{R}^N)$, hence none converges.
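To guarantee the disjoint supports, one concrete choice (this particular sequence is just an illustration; any $(x_n)$ whose pairwise distances exceed $2$ works) is:

\begin{align*}
x_n &= (3n, 0, \dots, 0) \in \mathbb{R}^N,
\qquad |x_n - x_m| = 3|n - m| \ge 3 > 2 \quad (n \ne m).
\end{align*}

Since $\operatorname{supp}(u_n) = S - x_n \subset B_1(-x_n)$ and the balls $B_1(-x_n)$, $B_1(-x_m)$ are disjoint whenever $|x_n - x_m| > 2$, every pair $u_n$, $u_m$ with $n \ne m$ has disjoint supports, so the orthogonality computation above applies to the whole sequence.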