I have a badly conditioned square matrix that I need to invert. Currently I'm doing the following steps:
- I take the badly conditioned $n$ by $n$ matrix.
- Using reduced row echelon form (RREF), I find $r$ linearly independent columns of the matrix (I have to choose an appropriate tolerance for the RREF). After the RREF, I know the indices of the linearly independent rows and columns.
- I keep the $r$ by $r$ submatrix containing only those linearly independent rows and columns.
- I invert the $r$ by $r$ matrix, either with a Cholesky decomposition if it is symmetric positive definite ($AA^{-1}=I$, so $LL^{T}A^{-1}=I$ and $A^{-1}=L^{-T}L^{-1}$) or with an LU decomposition ($AA^{-1}=I$, so $LUA^{-1}=I$ and $A^{-1}=U^{-1}L^{-1}$).
- This gives me the inverse as an $r$ by $r$ matrix.
- I create an $n$ by $n$ matrix of all zeros.
- I copy the elements of the $r$ by $r$ inverse into the $n$ by $n$ zero matrix, using the indices of the linearly independent rows and columns found in the previous steps.
- Finally, I have an $n$ by $n$ matrix that should be the inverse of the original badly conditioned $n$ by $n$ matrix.
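In NumPy, the steps look roughly like this (a minimal sketch; `independent_indices` and the example matrix are just illustrative, and the RREF here uses partial pivoting with a tolerance):

```python
import numpy as np

def independent_indices(A, tol=1e-10):
    """Return indices of linearly independent columns via row reduction (RREF)."""
    R = A.astype(float).copy()
    m, n = R.shape
    pivots = []
    row = 0
    for col in range(n):
        if row >= m:
            break
        # pick the largest remaining entry in this column as the pivot
        p = row + np.argmax(np.abs(R[row:, col]))
        if abs(R[p, col]) < tol:
            continue  # column depends on earlier columns
        R[[row, p]] = R[[p, row]]
        R[row] /= R[row, col]
        mask = np.arange(m) != row
        R[mask] -= np.outer(R[mask, col], R[row])
        pivots.append(col)
        row += 1
    return pivots

# Rank-deficient symmetric example: column 2 = column 0 + column 1
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0],
              [3.0, 3.0, 6.0]])

idx = independent_indices(A)      # -> [0, 1]
Ar = A[np.ix_(idx, idx)]          # r-by-r submatrix of independent rows/columns
Ar_inv = np.linalg.inv(Ar)        # invert the small nonsingular block

# embed the small inverse back into an n-by-n zero matrix (the final step above)
n = A.shape[0]
P = np.zeros((n, n))
P[np.ix_(idx, idx)] = Ar_inv
```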
My first question: is the above methodology correct?
My second question: is there a better methodology (faster and more accurate)?
Your method is certainly incorrect: it produces a matrix of rank $r$ rather than $n$, and a rank-deficient matrix can't be the inverse of anything.
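You can check this numerically; in the following NumPy sketch (the singular matrix is my own illustrative example), the zero-padded matrix fails to reproduce the identity:

```python
import numpy as np

# Singular 3x3 matrix: the third row/column is the sum of the first two,
# so the rank is 2 and no true inverse exists.
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0],
              [3.0, 3.0, 6.0]])

# Invert the nonsingular 2x2 leading block and embed it in zeros,
# as the question describes.
B = np.linalg.inv(A[:2, :2])
P = np.zeros((3, 3))
P[:2, :2] = B

print(np.linalg.matrix_rank(P))       # 2, not 3
print(np.allclose(P @ A, np.eye(3)))  # False: P is not an inverse of A
```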
You might want to look into the Moore-Penrose pseudo-inverse, which produces something like what you're calculating. Again, you can't call this "the inverse" of your matrix, but for some purposes it can be used instead of an inverse.
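A minimal sketch using NumPy's `numpy.linalg.pinv`, which computes the Moore-Penrose pseudo-inverse via the SVD with a singular-value cutoff (the rank-deficient matrix is illustrative):

```python
import numpy as np

# Rank-2 symmetric 3x3 matrix (third row/column = sum of the first two)
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0],
              [3.0, 3.0, 6.0]])

A_pinv = np.linalg.pinv(A)  # SVD-based pseudo-inverse

# The pseudo-inverse satisfies the Moore-Penrose conditions,
# e.g. A @ A_pinv @ A == A, even though A @ A_pinv != I.
print(np.allclose(A @ A_pinv @ A, A))      # True
print(np.allclose(A @ A_pinv, np.eye(3)))  # False for singular A
```

The cutoff tolerance (the `rcond` argument of `pinv`) plays the same role as the RREF tolerance in the question: it decides which near-zero singular values are treated as exactly zero.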