Suppose $ a, b, c, d $ are real numbers and $ r $ is a constant. Which values can $ r $ take so that the following set is a subring of the ring of $ 2\times 2 $ matrices: $$ \left\{ \begin{bmatrix}a&b\\c&d\end{bmatrix}: a+b+c+d=r \right\}? $$
I was thinking one possibility is $ r=0 $, but I had trouble showing that multiplication is closed. A little help here?
Following the answer below, I'd like to add a paragraph:
Actually, the only possibility for $ r $ is $ 0 $, since the additive identity $ 0 $ must be in the set.
$$ \begin{bmatrix}a&b\\c&d\end{bmatrix}\cdot \begin{bmatrix} a' & b'\\ c' & d'\end{bmatrix}=\begin{bmatrix} aa'+bc' & ab'+bd'\\ ca'+dc' & cb'+dd'\end{bmatrix} $$
In fact, the set can only be a subring if it is first an additive subgroup, so the zero matrix must be in the set; since its entries sum to $0$, this forces $r=0$. Now multiply two such matrices using the formula above. What do you get? Try to find a counterexample with specific matrices.
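Another way to see that $r=0$ is forced, spelled out as a one-line computation: if $A$ and $B$ both belong to the set, then $$ \sum_{i,j}(A+B)_{ij}=\sum_{i,j}A_{ij}+\sum_{i,j}B_{ij}=r+r=2r, $$ and closure under addition requires $2r=r$, i.e. $r=0$.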
More explicitly, try $$ A=\begin{bmatrix} 10 & 25 \\ 0 & -35 \end{bmatrix}, \qquad B=\begin{bmatrix} 1 & 1 \\ -1 & -1 \end{bmatrix}, $$ both of which have entry sum $0$. Then $AB$ is no longer in the set.
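Working the product out with the multiplication formula above: $$ AB=\begin{bmatrix} 10\cdot 1+25\cdot(-1) & 10\cdot 1+25\cdot(-1)\\ 0\cdot 1+(-35)\cdot(-1) & 0\cdot 1+(-35)\cdot(-1)\end{bmatrix}=\begin{bmatrix} -15 & -15\\ 35 & 35\end{bmatrix}, $$ whose entries sum to $-15-15+35+35=40\neq 0$. So even the candidate $r=0$ fails closure under multiplication, and no value of $r$ makes the set a subring.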