As stated in the title, we are given the matrix
$A = \begin{bmatrix}-1 & 0 & 0 & 0\\1 & -1 & 0 & 0\\1 & 1 & -1 & 0\\1 & 1 & 1 & -1\end{bmatrix}$
This matrix is clearly non-singular, being lower triangular with nonzero diagonal entries. The question is to find a matrix $E$ with the smallest maximum singular value $\bar{\sigma}(E)$ such that $A+E$ is singular.
Intuitively, I would suggest a zero matrix with a $1$ somewhere on the diagonal, giving $\bar{\sigma}(E)=1$. However, I cannot prove that this is optimal, nor can I find a method to show it, except in somewhat trivial cases. Can you help me with this problem?
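For reference, this candidate is easy to check numerically. The following NumPy sketch (my own code, not part of the question) places a $1$ at position $(1,1)$, which zeroes out the first row of $A+E$:

```python
import numpy as np

# The matrix A from the question.
A = np.array([[-1, 0, 0, 0],
              [ 1, -1, 0, 0],
              [ 1, 1, -1, 0],
              [ 1, 1, 1, -1]], dtype=float)

# Candidate perturbation: all zeros except a 1 at the (1,1) entry.
E = np.zeros((4, 4))
E[0, 0] = 1.0

# A + E has a zero first row, hence is singular.
assert np.isclose(np.linalg.det(A + E), 0.0)

# The maximum singular value of E is 1.
assert np.isclose(np.linalg.svd(E, compute_uv=False)[0], 1.0)
```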
Let $A=U\Sigma V^T$ be the singular value decomposition of $A.$
Let $\sigma$ be the smallest singular value of $A$ and $E=-UV^T\sigma .$ Then $\sigma$ is the maximum singular value of $E$ and $$ A+E = U\Sigma V^T - UV^T\sigma = U(\Sigma - \sigma I)V^T $$ which is obviously singular, because one of the diagonal elements of the diagonal matrix $(\Sigma - \sigma I)$ is $0.$
This does not by itself prove that the choice is optimal, but it yields the same result as Robert Israel's numerical approach. In fact, optimality follows from a standard argument: if $A+E$ is singular, there is a unit vector $x$ with $(A+E)x=0$, hence $\bar{\sigma}(E) \ge \|Ex\| = \|Ax\| \ge \sigma$, so no perturbation with $\bar{\sigma}(E) < \sigma$ can make $A+E$ singular.
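The construction above can be verified numerically. This NumPy sketch (variable names are mine) builds $E=-\sigma\,UV^T$ from the SVD of $A$ and checks that $\bar{\sigma}(E)=\sigma$ while $A+E$ is singular:

```python
import numpy as np

# The matrix A from the question.
A = np.array([[-1, 0, 0, 0],
              [ 1, -1, 0, 0],
              [ 1, 1, -1, 0],
              [ 1, 1, 1, -1]], dtype=float)

# Full SVD: A = U @ diag(s) @ Vt, with s sorted in decreasing order.
U, s, Vt = np.linalg.svd(A)
sigma = s[-1]  # smallest singular value of A

# The perturbation from the answer: E = -sigma * U V^T.
# Since U V^T is orthogonal, every singular value of E equals sigma.
E = -sigma * (U @ Vt)
assert np.isclose(np.linalg.svd(E, compute_uv=False)[0], sigma)

# A + E = U (Sigma - sigma I) V^T is singular: its smallest
# singular value is (numerically) zero.
assert np.linalg.svd(A + E, compute_uv=False)[-1] < 1e-10
```

Note that the rank-one perturbation $E=-\sigma\,u_n v_n^T$, built from the singular vectors belonging to $\sigma$, achieves the same norm $\bar{\sigma}(E)=\sigma$ and also makes $A+E$ singular.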