Let's start with two square matrices, $A$ and $I$. $I$ is the $n \times n$ identity matrix, and $A$ is the $n \times n$ matrix whose entries are all equal to $1$ (clearly, $A$ is not invertible).
Our task is to determine conditions on $a$ and $b$ such that $M = bA + (a-b)I$ is invertible, i.e. has rank $n$. Of course, we don't know what $n$ is, so the conditions should presumably not depend on it.
$a = b$ is certainly bad, since then $M = bA$ is not invertible, so $a$ and $b$ must differ. Is that all, though? Is $a \neq b$ sufficient to ensure invertibility of $M$?
The rows of $A$ are all equal (hence linearly dependent), while the rows of $I$ are linearly independent. Is that enough to conclude that $A + I$ (or any other linear combination of $A$ and $I$) has linearly independent rows/columns?
Any help is appreciated, thanks a lot!
$A$ has eigenvalue $0$ (as you already noticed, since it is not invertible) with geometric multiplicity $n-1$: by the rank-nullity theorem the nullity of $A$ is $n-1$, since the rank of $A$ is clearly $1$ (all rows are identical). The remaining eigenvalue of $A$ is then the trace of $A$, namely $n$ itself, with multiplicity $1$, since the trace equals the sum of the eigenvalues and all the other eigenvalues are zero.
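As a quick numerical sanity check of this spectrum (using NumPy; the size $n = 5$ is an arbitrary choice for illustration):

```python
import numpy as np

n = 5                            # arbitrary dimension for the check
A = np.ones((n, n))              # the all-ones matrix

# A is symmetric, so eigvalsh applies; expected spectrum:
# eigenvalue 0 with multiplicity n-1, eigenvalue n with multiplicity 1
eigvals = np.sort(np.linalg.eigvalsh(A))
print(eigvals)
```

The printed eigenvalues should be $n-1$ zeros followed by $n$ itself, up to floating-point error.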
Using said eigenvectors and eigenvalues it follows that $M=bA+(a-b)I$ will have eigenvalues $nb+(a-b)$ with geometric multiplicity $1$ and $(a-b)$ with multiplicity $n-1$.
To see this, let $v$ be an eigenvector of $A$ with eigenvalue $n$. We have $Mv = (bA + (a-b)I)v = (bA)v + ((a-b)I)v = b(Av) + (a-b)(Iv) = bnv + (a-b)v = (bn + a - b)v$. Similarly, if $w$ is an eigenvector of $A$ with eigenvalue $0$, then $Mw = b(Aw) + (a-b)w = (a-b)w$.
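This eigenvector computation is easy to verify numerically: the all-ones vector is an eigenvector of $A$ with eigenvalue $n$, so $M$ should scale it by $bn + a - b$ (the values $n$, $a$, $b$ below are arbitrary test choices):

```python
import numpy as np

n, a, b = 4, 5.0, 2.0            # arbitrary values for illustration
A = np.ones((n, n))
M = b * A + (a - b) * np.eye(n)

v = np.ones(n)                   # eigenvector of A with eigenvalue n
# M v should equal (b*n + a - b) * v
print(M @ v)
```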
Since a matrix is invertible iff it has no zero eigenvalue, $M$ is invertible so long as neither $nb + (a-b)$ nor $a-b$ is zero, i.e. precisely when $a \neq b$ and $a \neq (1-n)b$.
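A final check ties everything together: the eigenvalues of $M$ give $\det M = (a-b)^{n-1}\,(nb + a - b)$, which is nonzero exactly when both conditions hold (again with arbitrary test values satisfying $a \neq b$):

```python
import numpy as np

n, a, b = 5, 2.0, 3.0            # arbitrary test values with a != b
A = np.ones((n, n))
M = b * A + (a - b) * np.eye(n)

# Expected spectrum: a - b with multiplicity n-1, and n*b + (a - b) once
eigvals = np.sort(np.linalg.eigvalsh(M))
# Determinant = product of eigenvalues = (a-b)^(n-1) * (n*b + a - b)
det = np.linalg.det(M)
print(eigvals, det)
```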