I always thought that if the determinant of a matrix $A$ is $0$ then it has no inverse $A^{-1}$, until I saw an exercise in Contemporary Abstract Algebra by Gallian. It asks me to prove that the set of $2\times2$ matrices of the form $$\begin{bmatrix} a&a\\ a&a\\ \end{bmatrix}\,,$$ where $a \in \mathbb R$ and $a \neq 0$,
is a group under matrix multiplication. Every matrix in this set has determinant $0$, yet each one has an inverse within the set, namely $$\begin{bmatrix} \frac{1}{4a} & \frac{1}{4a} \\ \frac{1}{4a} & \frac{1}{4a} \\ \end{bmatrix}\,.$$
How is this possible? What lets these matrices escape the rule that a zero determinant means no inverse? What's the logic behind this?
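For concreteness, here is a quick numerical sanity check of the claim. This is a sketch of my own (the value $a = 2$ and the `matmul` helper are not from the exercise): the matrix of halves acts as the identity of the set, the matrix with entries $\frac{1}{4a}$ acts as the inverse of $\begin{bmatrix} a&a\\ a&a \end{bmatrix}$ inside the set, and yet the determinant is $0$.

```python
def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

a = 2.0  # arbitrary nonzero example value
A = [[a, a], [a, a]]

# Identity element of the set: the matrix of halves.
E = [[0.5, 0.5], [0.5, 0.5]]

# Candidate inverse within the set: every entry 1/(4a).
B = [[1 / (4 * a)] * 2 for _ in range(2)]

print(matmul(E, A) == A)  # True: E acts as the identity on A
print(matmul(A, B) == E)  # True: B inverts A inside the set
print(A[0][0] * A[1][1] - A[0][1] * A[1][0])  # 0.0: det(A) = 0
```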
It's a different notion of inverse. In linear algebra, the inverse of a $2\times 2$ matrix $A$ is the matrix $A'$ such that $$A\,A' = \begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\,,$$ if any such matrix exists. However, the group inverse is the matrix $A''$ such that $$A\,A'' = \begin{pmatrix}i & i \\ i & i\end{pmatrix}\,,$$ for whatever value of $i$ makes $$\begin{pmatrix}x & x\\x & x\end{pmatrix}\,\begin{pmatrix}i & i\\i & i\end{pmatrix} = \begin{pmatrix}x & x\\x & x\end{pmatrix}$$ true for all $x$ (the other answers say that we need $i=\tfrac12$).
$A'$ and $A''$ are clearly different matrices and there's no reason that the existence of $A''$ for a particular matrix $A$ should imply the existence of $A'$.
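The contrast between $A'$ and $A''$ can be made concrete with a small sketch (my own illustration; the names and the example value $x = 4$ are assumptions, not from the question). The group inverse $A''$ exists and yields the all-halves group identity, while the standard inverse $A'$ cannot exist: for any matrix $M$, both rows of $A\,M$ are identical, so the product can never be $\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}$.

```python
import random

def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

x = 4.0
A = [[x, x], [x, x]]

# Group inverse A'': entries 1/(4x); the product is the GROUP identity,
# the matrix of halves, not the usual identity matrix I.
A_group_inv = [[1 / (4 * x)] * 2 for _ in range(2)]
print(matmul(A, A_group_inv))  # [[0.5, 0.5], [0.5, 0.5]]

# Standard inverse A' cannot exist: since A's rows are equal, both rows
# of A @ M coincide for every M, so A @ M is never [[1, 0], [0, 1]].
M = [[random.random() for _ in range(2)] for _ in range(2)]
P = matmul(A, M)
print(P[0] == P[1])  # True for any choice of M
```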