Why does adding noise almost certainly make a square matrix invertible?
A matrix is invertible if and only if its determinant is nonzero. Geometrically, the determinant of an $n \times n$ matrix is the signed volume of the $n$-dimensional parallelepiped spanned by its row (or column)
vectors.
Now let our non-invertible matrix $A$ be, for simplicity, a $3 \times 3$ matrix. Obviously $\det(A) = 0$, i.e. the parallelepiped formed by the rows/columns of $A$ is flat (it has zero volume). If we add a noise matrix $V$ with entries $v_{ij} \sim \mathcal{N}(0,a)$ to $A$, i.e. form $A+V$, we perturb the rows/columns of $A$. How can one show, geometrically (or in any other way), that the new rows/columns are almost surely linearly independent?
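A minimal numerical sketch of the setup, assuming NumPy; the particular singular matrix and the noise scale $a = 10^{-6}$ are illustrative choices, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# A singular 3x3 matrix: the third row is the sum of the first two,
# so the parallelepiped spanned by the rows is flat.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])
print(np.linalg.det(A))  # essentially 0 (up to floating-point error)

# Perturb every entry with N(0, a) noise, here with a tiny scale a = 1e-6.
V = rng.normal(0.0, 1e-6, size=A.shape)
print(np.linalg.det(A + V))  # small but nonzero: A + V is invertible
```

Even an extremely small perturbation tilts the flat parallelepiped into one with nonzero volume.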
Denote by $V$ your random matrix of $v_{ij}$'s. Note that $\textrm{det}(A+V)$ is a polynomial in the entries of $V$.
Appealing to the implicit function theorem, the level sets of $B\mapsto \textrm{det}(A+B)$ are codimension-$1$ manifolds in $\mathbb{R}^{n^2}$ and hence are Lebesgue null sets. Since $V$ is Gaussian, its law is absolutely continuous with respect to Lebesgue measure, so for any $x\in \mathbb{R}$ we have $\mathbb{P}(\textrm{det}(A+V)=x)=0$. In particular this holds for $x=0$, and we conclude that $A+V$ is almost surely invertible.
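A quick Monte Carlo check of the conclusion, assuming NumPy; the singular matrix and the noise scale are hypothetical choices for illustration. The event $\textrm{det}(A+V)=0$ is Lebesgue-null, so no trial should ever land exactly on it:

```python
import numpy as np

rng = np.random.default_rng(1)

# Singular test matrix: row 3 = row 1 + row 2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# Draw many independent Gaussian noise matrices and record det(A + V).
dets = [np.linalg.det(A + rng.normal(0.0, 1e-6, size=(3, 3)))
        for _ in range(10_000)]

# Every determinant should be nonzero, i.e. A + V invertible in every trial.
print(min(abs(d) for d in dets) > 0.0)
```

Of course floating-point experiments cannot prove a measure-zero statement, but they illustrate why hitting the singular set exactly never happens in practice.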