I have a linear algebra exam tomorrow. Can someone help me understand how to approach this type of problem?
Problem: Let $A$ be an $n \times n$ matrix and let $R$ be its reduced row echelon form. Suppose that $R$ is not the identity matrix. Show that $\det(R)=0$.
I don't understand how to work with this kind of problem when $R$ is not the identity matrix.
Just to confirm: $R$ is the reduced row echelon form of $A$, and you want to show $\det R=0$?
The matrix $R$ is already upper triangular, so if it is different from the identity matrix, then it must have zeros at the tail end of its main diagonal (why$?$). Thus $\det R=0$, since the determinant of a triangular matrix equals the product of its diagonal entries. (Do you know how to expand a determinant along a row or column with few non-zero entries$?$)
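A small concrete example (my own, not from the problem) may make this clearer. Take
$$A=\begin{pmatrix}1&2\\2&4\end{pmatrix}.$$
Subtracting twice the first row from the second gives the reduced row echelon form
$$R=\begin{pmatrix}1&2\\0&0\end{pmatrix},$$
which is not the identity, and indeed
$$\det R = 1\cdot 0 = 0,$$
the product of the diagonal entries. Equivalently, expanding $\det R$ along the zero bottom row gives $0$ immediately, since every term in the expansion has a factor of $0$.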