I am trying to solve the following problem:
Let $u_1$ and $u_2$ be two orthogonal vectors in ${\rm I\!R}^n$ and set $a_1 = u_1$, $a_2 = u_1 + \varepsilon u_2$ for $\varepsilon>0$. Let also $A$ be the matrix with columns $a_1$ and $a_2$, and let $b$ be a vector linearly independent of $a_1$ and $a_2$. We discuss the least squares solution to the system $Ax = b$ as $\varepsilon\to0$.
(a) Find the matrix $A^\top A$, its inverse, and then $\hat{x} = (A^\top A)^{-1}A^\top b$ explicitly. Show that $\hat{x}$ explodes as $\varepsilon\to0$.
(b) Find the projection $A\hat{x}$ of $b$ onto $\operatorname{col}(A)$ and check that it does not depend on $\varepsilon>0$. Explain the result.
I have assumed that $A = \begin{pmatrix}u_1 & u_1+\varepsilon u_2\end{pmatrix}$, therefore:
$A^\top A=\begin{pmatrix}u_1^\top \\ (u_1+\varepsilon u_2)^\top\end{pmatrix}\begin{pmatrix}u_1 & u_1+\varepsilon u_2\end{pmatrix}=\begin{pmatrix}u_1^2 & u_1(u_1+\varepsilon u_2)\\u_1(u_1+\varepsilon u_2) & (u_1 + \varepsilon u_2)^2\end{pmatrix}$
But when I try to compute $(A^\top A)^{-1}$, the determinant comes out to be zero, which means the matrix is not invertible. What am I doing wrong? Thanks in advance for any hints!
Bear in mind what it means to multiply vectors in the first place: the entries of $A^\top A$ are dot products, not products of scalars, so they do not cancel the way scalar squares would. Your determinant is $$(u_1\cdot u_1) (u_1\cdot u_1 + 2\varepsilon u_1\cdot u_2+\varepsilon^2 u_2\cdot u_2)-(u_1\cdot u_1 + \varepsilon u_1\cdot u_2)^2=\varepsilon^2 (u_1\cdot u_1\, u_2\cdot u_2-(u_1\cdot u_2)^2).$$ Since $u_1\perp u_2$, the cross term vanishes and the determinant is $\varepsilon^2\,\|u_1\|^2\|u_2\|^2>0$, so $A^\top A$ is invertible for every $\varepsilon>0$ (though it degenerates as $\varepsilon\to0$).
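A quick numerical sketch of parts (a) and (b), using a concrete orthogonal pair in ${\rm I\!R}^3$ (the choice of $u_1$, $u_2$, $b$ below is my own, purely for illustration): the determinant of $A^\top A$ is exactly $\varepsilon^2\|u_1\|^2\|u_2\|^2$, the solution $\hat{x}$ blows up like $1/\varepsilon$, yet the projection $A\hat{x}$ stays fixed, because $\operatorname{col}(A)=\operatorname{span}\{u_1,u_2\}$ for every $\varepsilon>0$.

```python
import numpy as np

# Hypothetical concrete data (an assumption for illustration; any
# orthogonal u1, u2 and any b outside their span would do).
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
b  = np.array([1.0, 1.0, 1.0])   # linearly independent of a1, a2

for eps in [1e-1, 1e-2, 1e-3]:
    A = np.column_stack([u1, u1 + eps * u2])
    G = A.T @ A                          # the Gram matrix A^T A
    # det(G) = eps^2 (|u1|^2 |u2|^2 - (u1.u2)^2) = eps^2 here
    print("det:", np.linalg.det(G))
    x_hat = np.linalg.solve(G, A.T @ b)  # normal equations
    print("x_hat:", x_hat)               # grows like 1/eps
    print("proj :", A @ x_hat)           # stays (1, 1, 0) for all eps
```

With this data, $\hat{x} = (1 - 1/\varepsilon,\; 1/\varepsilon)$, while $A\hat{x} = (1,1,0)$, the projection of $b$ onto the $xy$-plane, independent of $\varepsilon$. That explains part (b): the column space is the same for all $\varepsilon>0$, so the projection cannot depend on it, even though the coordinates of $\hat{x}$ in the nearly parallel basis $\{a_1, a_2\}$ degenerate.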