As you may know, to invert a lower triangular matrix with unit diagonal, it is enough to negate the entries below the diagonal.
For example, take
$$A = \left(\begin{matrix} 1 & 0 & 0 & 0 \\ 3 & 1 & 0 & 0 \\ -1 & 0 & 1 & 0 \\ 2 & 0 & 0 & 1 \\ \end{matrix}\right) $$
Then the inverse of $A$ is
$$A' = \left( \begin{matrix} 1 & 0 & 0 & 0 \\ -3 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ -2 & 0 & 0 & 1 \\ \end{matrix}\right) $$
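As a quick numerical sanity check (only a check of this one example, not the general proof asked for below), here is a small pure-Python sketch that builds $A'$ by negating the below-diagonal entries of $A$ and verifies that $A'A = I$:

```python
# The 4x4 unit lower triangular example from above.
A = [
    [1, 0, 0, 0],
    [3, 1, 0, 0],
    [-1, 0, 1, 0],
    [2, 0, 0, 1],
]

# A' negates every entry strictly below the diagonal.
A_prime = [[-v if i > j else v for j, v in enumerate(row)]
           for i, row in enumerate(A)]

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I4 = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
print(matmul(A_prime, A) == I4)  # True: A' is indeed the inverse here
```

Again, this only confirms the single example; it does not settle whether the negation trick works for every unit lower triangular matrix.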
I just need to prove this.
My question is not related to any software; it is a general linear algebra question. It is not enough to say that "if $A'$ is the inverse of $A$, then the product of $A'$ and $A$ must be $I$ (the identity matrix)", because verifying one example says nothing about the general case. I need a general proof.
This is related to Gaussian elimination and LU decomposition.
Thank you for any help.
I don't think your statement is correct. For example, $$ \pmatrix{ 1\\ 1&1\\ 0&1&1\\ 0&0&1&1 }^{-1} = \left( \begin{array}{rrrr} 1\\ -1&1\\ 1&-1&1\\ -1&1&-1&1 \end{array} \right) $$ However, if only the first column is non-zero, then we can write our matrix in the form $$ A = \pmatrix{ 1&0\\ x&I_3 } $$ where $x \in \Bbb R^3$ and $I_3$ is the size $3$ identity matrix. We then note using block-matrix multiplication that $$ \pmatrix{ 1&0\\ x&I_3 } \pmatrix{ 1&0\\ -x & I_3} = \pmatrix{1&0\\0&I_3} = I_4 $$ so that indeed, we can find the inverse by negating whatever is below the diagonal.
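Both halves of the answer, the counterexample and the one-nonzero-column case, can be checked numerically. The sketch below (pure Python, no libraries) shows that naive negation fails for the bidiagonal matrix above, while the alternating-sign matrix from the answer really is its inverse:

```python
def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I4 = [[1 if i == j else 0 for j in range(4)] for i in range(4)]

# The counterexample: a second nonzero sub-diagonal column breaks the trick.
B = [
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
]
B_neg = [[-v if i > j else v for j, v in enumerate(row)]
         for i, row in enumerate(B)]
print(matmul(B_neg, B) == I4)  # False: negation does NOT invert B

# The actual inverse from the answer, with alternating signs.
B_inv = [
    [1, 0, 0, 0],
    [-1, 1, 0, 0],
    [1, -1, 1, 0],
    [-1, 1, -1, 1],
]
print(matmul(B_inv, B) == I4)  # True
```

This matches the block-matrix argument: when only the first column is nonzero below the diagonal, the product telescopes to the identity, but with more nonzero columns the sub-diagonal entries interact and the simple negation no longer cancels them.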