Is $\{x\in\mathbb{R}^2 \mid \langle Ax,x \rangle =1\}$ compact?


Is $S = \{x\in\mathbb{R}^2 \mid \langle Ax,x \rangle =1\}$ compact in $\mathbb{R}^2$ when:

a.

$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$

b.

$A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$

(inner products and distance are the standard ones in $\mathbb{R}^2$).

I feel that this should be relatively simple, but I can't seem to solve it. The most promising approach seems to be showing that the set is closed and bounded. Other approaches, such as showing that every sequence has a convergent subsequence, seem less promising, since they do not use the fact that we are in $\mathbb{R}^2$, and they feel clumsy besides (but I may be wrong).

For the bounded part, I thought of taking $x,y\in S$, and then showing that $||x-y||_2$ is uniformly bounded over $S$, by somehow transitioning to the inner product to use the facts about $A$, i.e.

$||x-y||_2^2 = \langle x-y, x-y \rangle = $ [some function of $\langle Ax,x \rangle$ and $\langle Ay,y \rangle$ which would hopefully lead to some insight, but after messing about with these I can't seem to get anywhere interesting].

Since this did not work, and the values in $A$ clearly have to be used somehow, I resorted to spelling out the dot product to obtain (for a.):

$\langle Ax,x \rangle = 2x_1^2 +2x_1x_2 + 2x_2^2 = 1$

but this didn't yield any obvious connection to $||x-y||_2$, and it is clearly not the cleanest approach...

I'm pretty rusty on inner products etc., so I reckon I'm missing something a lot simpler here. Also, I'm completely at a loss as to how to show this set is closed (assuming the closed + bounded approach is relevant).

Any advice would be much appreciated!


3 Answers

Accepted answer

Since $\langle Ax,x\rangle = 2x_1^2 + 2x_1x_2 + 2x_2^2=1$, you can define the function $$f:\mathbb{R}^2\to\mathbb{R},\ \ \ f(x_1,x_2)= 2x_1^2 + 2x_1x_2 + 2x_2^2.$$ This function is continuous, since it only involves sums, products and powers. If we call $H=\{(x_1,x_2)\in\mathbb{R}^2:\langle Ax,x\rangle=1\}$, then you can also write $$H=\{(x_1,x_2)\in\mathbb{R}^2:f(x_1,x_2)=1\} = f^{-1}(\{1\}).$$ Since $\{1\}$ is a closed set in $\mathbb R$ and $f$ is continuous, the preimage $H$ is closed.
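
As a quick numerical sanity check (our addition, not part of the original argument), one can confirm with numpy that the expanded polynomial really agrees with $\langle Ax,x\rangle$ at random points:

```python
import numpy as np

# Sanity check (not a proof): the expanded polynomial
# 2*x1**2 + 2*x1*x2 + 2*x2**2 should agree with <Ax, x>.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def f(x):
    """The quadratic form f(x) = <Ax, x>."""
    return (A @ x) @ x

rng = np.random.default_rng(0)
for _ in range(5):
    x1, x2 = rng.normal(size=2)
    assert np.isclose(f(np.array([x1, x2])), 2*x1**2 + 2*x1*x2 + 2*x2**2)
print("expansion matches <Ax, x>")
```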

We are now left to check whether our set $H$ is bounded. This argument works for symmetric matrices, which by the spectral theorem admit an orthonormal basis of eigenvectors; both of your matrices are symmetric.

Let $\lambda_1, \dots,\lambda_n$ be the eigenvalues of an $n \times n$ symmetric matrix $A$ (listed with multiplicity) and $v_1,\dots,v_n$ an associated orthonormal basis of eigenvectors. We can write $x\in\mathbb R^n$ as $$x=\sum_{i=1}^na_iv_i, \ \ \ a_i\in\mathbb R.$$ We also have that $$Ax = \sum_{i=1}^n\lambda_ia_iv_i, \ \ \ \langle Ax,x\rangle = \sum_{i=1}^n\lambda_ia_i^2, \ \ \ ||x||^2=\sum_{i=1}^na_i^2.$$
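
As a hedged numerical check of these three identities (our sketch, using the fact that `np.linalg.eigh` returns the eigenvalues of a symmetric matrix together with an orthonormal eigenbasis as columns):

```python
import numpy as np

# Verify the identities above for the matrix from part (a).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(A)   # eigenvalues, columns of V = orthonormal eigenvectors

x = np.array([0.7, -1.3])    # an arbitrary test vector
a = V.T @ x                  # coordinates a_i of x in the eigenbasis

assert np.allclose(A @ x, V @ (lam * a))            # Ax = sum lam_i a_i v_i
assert np.isclose((A @ x) @ x, np.sum(lam * a**2))  # <Ax, x> = sum lam_i a_i^2
assert np.isclose(x @ x, np.sum(a**2))              # ||x||^2 = sum a_i^2
print("all three identities hold; eigenvalues:", lam)
```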

The third equality tells us that $H$ is bounded if the coordinates $a_i$ are bounded uniformly over all $x\in H$. Let's look at our particular matrices.

  1. The first matrix has eigenvalues $1$ and $3$. This implies that $$1 = \langle Ax,x\rangle = a_1^2 + 3a_2^2,$$ so $a_1^2 \leq 1$ and $a_2^2 \leq \tfrac13$. Hence $||x||^2 = a_1^2 + a_2^2 \leq \tfrac43$ for all $x\in H$, so $H$ is bounded and therefore compact.
  2. The second matrix has eigenvalues $-1$ and $3$. Then $$1 = \langle Ax,x\rangle = -a_1^2 + 3a_2^2 \Longleftrightarrow a_1^2 = 3a_2^2 - 1.$$ Using this equation you can pick $a_2$ as large as you want (any $a_2$ with $3a_2^2 \geq 1$ works), and you will always be able to find a value of $a_1$ satisfying the relation required by the definition of $H$; a numerical sketch follows this list. Then $H$ is not bounded, so it is not compact.
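
To make case 2 concrete, here is a small numpy sketch (ours, not part of the original answer) that constructs points of $H$ with arbitrarily large norm for the second matrix, using $a_1^2 = 3a_2^2 - 1$:

```python
import numpy as np

# Build points on {x : <Ax, x> = 1} with huge norm for the second matrix,
# confirming that H is unbounded in case 2.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, V = np.linalg.eigh(A)        # lam = [-1., 3.] (eigh sorts ascending)

for a2 in [1.0, 10.0, 1000.0]:
    a1 = np.sqrt(3*a2**2 - 1)     # coordinate along the eigenvalue -1 direction
    x = a1 * V[:, 0] + a2 * V[:, 1]
    assert np.isclose((A @ x) @ x, 1.0)   # x lies on the curve
    print(f"a2 = {a2:7.1f}  ->  ||x|| = {np.linalg.norm(x):9.2f}")
```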

This is an "easier" proof than Ivo's and Lee's, but as you can see is way longer than theirs, which are more general than mine and are well worth understanding.

Answer

The set $\{x \in \Bbb R^n \mid \langle Ax,x\rangle = 1\}$, where $A\colon \Bbb R^n \to \Bbb R^n$ is symmetric and non-singular and $\langle\cdot,\cdot\rangle$ is the standard inner product, is compact if and only if $A$ is positive-definite or negative-definite (in the negative-definite case the set is empty, hence trivially compact).

Sylvester's criterion says that a symmetric matrix is positive-definite if and only if all of its leading principal minors are positive. In the case $n=2$, it suffices to look at the determinant: if it is positive, the matrix is definite (positive- or negative-definite according to the sign of the trace), while if it is negative, the matrix is indefinite.

Since the determinants of the matrices given in items (a) and (b) are $3$ and $-3$, respectively, the answers are "yes" and "no".
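
For completeness, here is a hedged numpy sketch of this determinant test (the helper name `is_compact_level_set` is ours, and the function assumes a symmetric $2\times 2$ input with nonzero determinant):

```python
import numpy as np

def is_compact_level_set(A):
    """For a symmetric 2x2 matrix A with det(A) != 0, decide whether
    {x : <Ax, x> = 1} is compact, via the determinant test above."""
    det = np.linalg.det(A)
    # det > 0: A is definite -> ellipse (positive case) or empty set
    # (negative case), both compact.
    # det < 0: A is indefinite -> hyperbola, not compact.
    return det > 0

print(is_compact_level_set(np.array([[2.0, 1.0], [1.0, 2.0]])))  # True,  det =  3
print(is_compact_level_set(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False, det = -3
```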

Answer

If you instead had a matrix of the form $A = \begin{pmatrix} a & 0 \\ 0 & c \end{pmatrix}$, and so you were looking at the solution set of $ax^2 + cy^2 = 1$, then you would know the answer: if $a,c$ are both positive then you get an ellipse, which is compact; if both are negative then you get the empty set, which is compact; if $a,c$ are nonzero and have opposite signs then you get a hyperbola, which is noncompact; if one of $a$ or $c$ is zero and the other is positive you get a pair of parallel lines (e.g. $cy^2=1$ gives $y = \pm 1/\sqrt{c}$), which is noncompact; and I'm sure you can enumerate the remaining cases when one or both of $a,c$ are zero (a sketch of the full case analysis follows below). All of the topology in this problem goes into the simple proofs of compactness or noncompactness for those particular kinds of sets: ellipses, hyperbolas, parallel lines, etc.
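
Here is that case analysis written out as a short Python function (the name `classify_diagonal` and the layout are ours, following the enumeration above):

```python
def classify_diagonal(a, c):
    """Classify {(x, y) : a*x**2 + c*y**2 = 1} and report compactness."""
    if a > 0 and c > 0:
        return "ellipse (compact)"
    if a < 0 and c < 0:
        return "empty set (compact)"
    if a * c < 0:
        return "hyperbola (not compact)"
    if a == 0 and c == 0:
        return "empty set (compact)"            # 0 = 1 has no solutions
    nz = a if c == 0 else c                     # exactly one coefficient is zero
    if nz > 0:
        return "two parallel lines (not compact)"
    return "empty set (compact)"                # e.g. -y**2 = 1 has no solutions

for a, c in [(2, 3), (-1, -1), (1, -1), (0, 1), (0, -1), (0, 0)]:
    print((a, c), "->", classify_diagonal(a, c))
```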

The rest of the problem requires some linear algebra, in order to determine what to do for an arbitrary symmetric matrix $A = \begin{pmatrix} a & b \\ b & c \end{pmatrix}$ of real numbers. The main theorem (the spectral theorem for real symmetric matrices) is that $A$ is orthogonally diagonalizable, meaning that there exists an orthogonal matrix $B$ (so $B^{-1}=B^T$) such that the matrix $A' = B^{-1}AB = B^TAB$ is diagonal. From this, with a bit of calculation, you can work out that the solution set of $\langle Ax,x\rangle = 1$ is equal to the image of the solution set of $\langle A'x,x \rangle = 1$ under the matrix $B$ acting as an invertible linear transformation on $\mathbb R^2$, and these solution sets have the same type: both ellipses; or both hyperbolas; etc. Furthermore, you can tell which type it is by knowing the corollary that the eigenvalues of $A$ and $A'$ are identical, in particular they have the same signs: either $A$ and $A'$ both have two positive eigenvalues; or they both have one positive and one negative eigenvalue; etc.
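
A compact numerical illustration of this change of variables (our sketch, using `np.linalg.eigh` for the orthogonal diagonalization; the variable names are ours):

```python
import numpy as np

# If <A'y, y> = 1 with A' = B^T A B diagonal and B orthogonal, then x = B y
# satisfies <Ax, x> = 1, so B maps one solution set onto the other.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, B = np.linalg.eigh(A)           # A = B @ diag(lam) @ B.T, with B orthogonal

y2 = 2.0                             # pick y on the diagonal-form curve:
y1 = np.sqrt((1 - lam[1]*y2**2) / lam[0])   # lam[0]*y1^2 + lam[1]*y2^2 = 1
y = np.array([y1, y2])
x = B @ y                            # map it to the original curve

assert np.isclose((A @ x) @ x, 1.0)  # x solves <Ax, x> = 1
print("x =", x, " <Ax, x> =", (A @ x) @ x)
```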

So to solve your problem, you just have to determine the signs of the eigenvalues of your matrices $A$.