Understanding a Higher-Dimensional Solution Set


I'm interested in understanding the solution set of the following simultaneous equations (as well as higher dimensional generalizations with up to 20 variables) for $x_1,~x_2,~x_3,~x_4 \in \mathbb{R}$: \begin{align} x_1^2 + x_2^2 + x_3^2 + x_4^2 = 1\\ x_1 x_2 + x_3 x_4 = 0 \end{align} In particular, I'd like to know if the solution set is path-connected or not.

I considered a brute force approach of testing many points near a known solution to see if there is another solution nearby, but this would be insufficient to establish path connectedness.

I can also try solving the system in special cases, like when $x_4 = 0$, but stitching together these solutions to make a complete picture is difficult.

I feel like I don't know the right tools to attack this problem. What kind of mathematics would be used here? (e.g., algebraic geometry? I only have a vague idea of what it's about...) Are there any books/PDFs you can recommend that show how to attack these kinds of problems?

EDIT 1: Just realized that for this particular case, the first equation is not that interesting. This is because the solution set of the second equation is a cone: scaling a solution $x$ of the second equation by any factor $a \in \mathbb{R}$ keeps us inside its solution space: \begin{align} x_1 x_2 + x_3 x_4 = 0 \implies a^2 (x_1 x_2 + x_3 x_4) = 0 \implies (ax_1) (ax_2) + (ax_3) (ax_4) = 0 \end{align} So if a nonzero solution $x$ of the second equation fails to satisfy the first equation, we can immediately rescale it: setting $b = \sqrt{x_1^2 + x_2^2 + x_3^2 + x_4^2} > 0$, the scaled solution $x/b$ satisfies both equations: \begin{align} x_1^2 + x_2^2 + x_3^2 + x_4^2 = b^2 \implies \frac{1}{b^2} ( x_1^2 + x_2^2 + x_3^2 + x_4^2) = 1 \\ \implies (x_1/b)^2 + (x_2/b)^2 + (x_3/b)^2 + (x_4/b)^2 = 1 \end{align} I need to think some more about what this means for the solution space of both equations, though...
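A quick numeric check of this scaling argument (a small Python sketch; the starting point `x` is just one arbitrary nonzero solution of the second equation):

```python
import math

# An arbitrary nonzero solution of x1*x2 + x3*x4 = 0 (here x2 = x3 = 0).
x = [1.7, 0.0, 0.0, -2.3]
assert abs(x[0]*x[1] + x[2]*x[3]) < 1e-12

# Scaling by any factor a keeps us in the solution set of the second equation.
a = -3.1
xa = [a*xi for xi in x]
assert abs(xa[0]*xa[1] + xa[2]*xa[3]) < 1e-10

# Normalizing by b = |x| then satisfies the first equation as well.
b = math.sqrt(sum(xi*xi for xi in x))
xn = [xi/b for xi in x]
assert abs(sum(xi*xi for xi in xn) - 1.0) < 1e-12
assert abs(xn[0]*xn[1] + xn[2]*xn[3]) < 1e-12
```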


There is 1 best solution below.

On BEST ANSWER

Let's introduce a change of coordinates. Define $$ (y_1,y_2,y_3,y_4) = \left(\frac{x_1+x_2}{\sqrt{2}},\frac{x_1-x_2}{\sqrt{2}},\frac{x_3+x_4}{\sqrt{2}},\frac{x_3-x_4}{\sqrt{2}}\right). $$ Note that this is an invertible linear transformation; in fact it is orthogonal and its own inverse, so $x_1 = \frac{y_1+y_2}{\sqrt{2}}$, $x_2 = \frac{y_1-y_2}{\sqrt{2}}$, and similarly for $x_3, x_4$. Now $$ \begin{split} x_1^2+x_2^2+x_3^2+x_4^2 &= \left(\frac{y_1+y_2}{\sqrt{2}}\right)^2 + \left(\frac{y_1-y_2}{\sqrt{2}}\right)^2 + \left(\frac{y_3+y_4}{\sqrt{2}}\right)^2 + \left(\frac{y_3-y_4}{\sqrt{2}}\right)^2 \\ &= y_1^2+y_2^2+y_3^2+y_4^2, \end{split} $$ so $x_1^2+\dotsb+x_4^2=1$ if and only if $y_1^2+\dotsb+y_4^2=1$. Your first equation stays "the same".
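One can check numerically that this change of coordinates is an orthogonal matrix, so it preserves the sum of squares (a small NumPy sketch):

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)
# Matrix M with y = M x for the change of coordinates above.
M = np.array([
    [s,  s, 0., 0.],   # y1 = (x1 + x2)/sqrt(2)
    [s, -s, 0., 0.],   # y2 = (x1 - x2)/sqrt(2)
    [0., 0., s,  s],   # y3 = (x3 + x4)/sqrt(2)
    [0., 0., s, -s],   # y4 = (x3 - x4)/sqrt(2)
])

# M is orthogonal (and symmetric, hence its own inverse): M^T M = I,
# so |y|^2 = |x|^2 for every x.
assert np.allclose(M.T @ M, np.eye(4))
x = np.array([0.3, -1.2, 0.7, 2.0])
y = M @ x
assert np.isclose(y @ y, x @ x)
```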

For the second equation we see $$ x_1 x_2 + x_3 x_4 = \frac{(y_1+y_2)(y_1-y_2)}{2} + \frac{(y_3+y_4)(y_3-y_4)}{2} = \frac{1}{2}(y_1^2-y_2^2+y_3^2-y_4^2). $$ Therefore $x_1 x_2 + x_3 x_4 = 0$ if and only if $$ y_1^2 - y_2^2 + y_3^2 - y_4^2 = 0 . $$ Our system of equations is $$ \left\{ \begin{align} y_1^2 + y_2^2 + y_3^2 + y_4^2 &= 1 \\ y_1^2 - y_2^2 + y_3^2 - y_4^2 &= 0 \end{align} \right. $$ Adding and subtracting the equations, this is equivalent to $$ \left\{ \begin{align} y_1^2 + y_3^2 &= \frac{1}{2} \\ y_2^2 + y_4^2 &= \frac{1}{2} \end{align} \right. $$ And now the solution set is evidently a product $S^1 \times S^1$, a product of a circle in the $y_1 y_3$-plane and a circle in the $y_2 y_4$-plane. In particular it is path-connected.
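As a sanity check, one can parametrize the torus by two angles, map back to the $x$-coordinates, and verify both original equations (a small NumPy sketch):

```python
import numpy as np

r = 1.0 / np.sqrt(2.0)
for s in np.linspace(0.0, 2*np.pi, 25):
    for t in np.linspace(0.0, 2*np.pi, 25):
        # Point on S^1 x S^1: a circle of radius 1/sqrt(2) in the
        # y1y3-plane and another in the y2y4-plane.
        y1, y3 = r*np.cos(s), r*np.sin(s)
        y2, y4 = r*np.cos(t), r*np.sin(t)
        # Invert the change of coordinates (it is its own inverse).
        x1, x2 = (y1 + y2)*r, (y1 - y2)*r
        x3, x4 = (y3 + y4)*r, (y3 - y4)*r
        # Both original equations hold.
        assert np.isclose(x1**2 + x2**2 + x3**2 + x4**2, 1.0)
        assert np.isclose(x1*x2 + x3*x4, 0.0, atol=1e-12)
```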

I believe that this is often called simultaneous diagonalization of quadrics: for any two quadratic forms, under suitable conditions (e.g., when one of them is positive definite), there is a linear change of variables that simultaneously diagonalizes them, i.e., writes both quadratic forms as linear combinations of squares of coordinates (no cross-terms). See for example Simultaneous diagonalisation of two quadratic forms, one of which is positive definite, and Simultaneous diagonalization of quadratic forms.
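For this particular pair of forms the diagonalizing change of variables can be recovered numerically: the first form's matrix is the identity (positive definite), so an orthogonal eigenbasis of the second form's matrix diagonalizes both forms at once. A NumPy sketch:

```python
import numpy as np

# Symmetric matrix Q of the second form: x1*x2 + x3*x4 = x^T Q x.
Q = np.zeros((4, 4))
Q[0, 1] = Q[1, 0] = 0.5
Q[2, 3] = Q[3, 2] = 0.5

# Orthogonal eigendecomposition: Q = V diag(w) V^T with V^T V = I.
w, V = np.linalg.eigh(Q)

# In the coordinates y = V^T x, the first form stays y1^2 + ... + y4^2
# (since V is orthogonal) and the second becomes sum_i w_i * y_i^2.
assert np.allclose(V.T @ V, np.eye(4))
assert np.allclose(V.T @ Q @ V, np.diag(w))
print(w)  # eigenvalues +-1/2, each with multiplicity 2
```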

If your higher-dimensional generalizations still involve only two quadratic equations (just with more variables), then you might be able to tackle them with this and nothing more.

If you have more than two equations, or equations of degree higher than two, then you might need to bring in more machinery. Probably someone else has a better idea, but all I can think of is to read a little about algebraic geometry. And you are probably most interested in a perspective that emphasizes practical computations with explicit polynomials (as opposed to, say, coordinate-free approaches). Therefore I am going to suggest the following book:

Ideals, Varieties, and Algorithms, by David Cox, John Little, and Donal O'Shea http://dacox.people.amherst.edu/iva.html

This is an excellent (award-winning!) undergraduate introduction to computational commutative algebra and algebraic geometry.

There are many, many other books, online notes, etc. Perhaps someone else can suggest a source that manages to go all the way from computations with explicit polynomials, to topology of varieties.