Prove the following statement:
There exists an $\epsilon > 0$ such that the following holds:
If $A = (a_{ij}) \in \text{Mat}_{2,2} (\mathbb{R})$ is a matrix with $|a_{ij}| < \epsilon$ for $i,j \in \{1,2\}$, then the equation
$$X^2 + X = A$$
has a solution $X \in \text{Mat}_{2,2} (\mathbb{R})$.
My idea on how to solve this:
Let $X = \begin{bmatrix} v& w \\ x & y \end{bmatrix}$. Therefore $X^2 + X = \begin{bmatrix} v^2 + v + w x& v w + w y + w\\ v x + x y + x & w x + y^2 + y \end{bmatrix} = \begin{bmatrix} a_0 & a_1 \\ a_2 & a_3\end{bmatrix}$
Let's now define the function $$ h(v,w,y,x,a_0,a_1,a_2,a_3) = \begin{pmatrix} v^2 + v + w x - a_0 \\ v w + w y + w - a_1\\ v x + x y + x - a_2 \\ w x + y^2 + y - a_3 \end{pmatrix} $$ and look for solutions of $h = 0$.
We can now calculate the derivative of $h$:
$$dh = \begin{bmatrix} 2v + 1 & x & 0 & w & -1&0&0&0\\ w& v+y+1& w& 0& 0&-1&0&0\\ x & 0&x&v+y+1 & 0&0&-1&0 \\0&x&2y+1&w& 0&0&0&-1 \end{bmatrix}$$
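(Not part of the sheet, but a quick way to double-check this computation is a small symbolic sketch in Python with sympy; the column order $(v, w, y, x)$ below matches the ordering used in $dh$.)

```python
# Sketch: symbolic check of the system h and its Jacobian (sympy assumed available)
import sympy as sp

v, w, x, y = sp.symbols('v w x y')
a0, a1, a2, a3 = sp.symbols('a0 a1 a2 a3')

h = sp.Matrix([
    v**2 + v + w*x - a0,
    v*w + w*y + w - a1,
    v*x + x*y + x - a2,
    w*x + y**2 + y - a3,
])

# Left 4x4 block of dh: derivatives in the order (v, w, y, x)
print(h.jacobian([v, w, y, x]))
# Right 4x4 block of dh: derivatives in a_0, ..., a_3 give minus the identity
print(h.jacobian([a0, a1, a2, a3]))
```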
The idea now would be to apply the implicit function theorem and show that there exists an $X$ which solves this equation. I am not sure though if this approach is correct.
Last but not least: this question comes from an analysis sheet, so I assume one should use the methods of analysis to solve it.
Is my approach the correct way? And how does one proceed from here?
Feel free to use another approach.
Thank you for your time.
Identify the matrix $X=\begin{pmatrix} x_1 & x_2 \\ x_3 & x_4 \end{pmatrix}$ with the vector $\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix}.$
Then the mapping $X \mapsto X^2 + X$ is equivalent to the following mapping from $\mathbb{R}^4$ to $\mathbb{R}^4$:
$$ x = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} \to f(x) = \begin{pmatrix} x_1^2 + x_2x_3+x_1\\ x_1x_2+x_2x_4+x_2\\ x_3x_1 + x_3x_4+x_3\\ x_3x_2+x_4^2 +x_4\\ \end{pmatrix} . $$
The Jacobian matrix $f'(x)$ is a continuous function of $x$ and is given by $$ \begin{pmatrix} 2x_1+1 & x_3 & x_2 & 0\\ x_2 & x_1+x_4+1& 0 & x_2\\ x_3 & 0 & x_1 + x_4 + 1 & x_3 \\ 0 & x_3 & x_2 & 2x_4 + 1\\ \end{pmatrix}. $$
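(As a sanity check, not needed for the argument, one can evaluate this Jacobian at the origin symbolically; a small Python/sympy sketch:)

```python
# Sketch: verify that f'(0) is the 4x4 identity matrix (sympy assumed available)
import sympy as sp

x1, x2, x3, x4 = sp.symbols('x1 x2 x3 x4')
f = sp.Matrix([
    x1**2 + x2*x3 + x1,
    x1*x2 + x2*x4 + x2,
    x3*x1 + x3*x4 + x3,
    x3*x2 + x4**2 + x4,
])

J = f.jacobian([x1, x2, x3, x4])
print(J.subs({x1: 0, x2: 0, x3: 0, x4: 0}))  # prints the 4x4 identity matrix
```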
If $\mathbf{0}$ denotes the zero vector, then it is easy to see that $f'(\mathbf{0})$ is the identity matrix, and by continuity there is an open set $U$ containing the origin on which $f'(x)$ is invertible. Hence $f$ is an open mapping on $U$ (Apostol, Mathematical Analysis, 13.5, for example), and in particular the image of $U$ contains a sufficiently small open ball around $f(\mathbf{0}) = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}.$
So $[-\epsilon, \epsilon]^4$ lies in the image of $U$ for sufficiently small $\epsilon > 0$; any $A$ with $|a_{ij}| < \epsilon$ corresponds to a vector in this cube, hence has a preimage $X \in \text{Mat}_{2,2}(\mathbb{R})$ with $X^2 + X = A$, and the result follows.
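For intuition only, here is a small numerical sketch (Python with numpy/scipy; the sample matrix $A$ below is an arbitrary choice with small entries) that finds such an $X$ by solving the $4 \times 4$ system starting from $X = 0$, where the Jacobian is the identity:

```python
# Sketch: numerically find X with X^2 + X = A for one small A (numpy/scipy assumed)
import numpy as np
from scipy.optimize import fsolve

A = np.array([[0.01, -0.02],
              [0.03,  0.015]])   # |a_ij| < eps for a small eps

def residual(xvec):
    X = xvec.reshape(2, 2)
    return (X @ X + X - A).ravel()

# Start at X = 0, where f'(0) is the identity, so the solver stays near the origin
X = fsolve(residual, np.zeros(4)).reshape(2, 2)
print(X)
print(np.allclose(X @ X + X, A))  # True: a real solution close to the zero matrix
```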