Efficiently invert a tridiagonal symmetric matrix


I need to invert a tridiagonal symmetric $1000\times1000$ matrix (or of similar dimensions). Specifically, I need to solve the following equation: $$Ax=b$$ with $$A=\begin{pmatrix}v_1&-\frac{dt}{2}&0&\cdots\\-\frac{dt}{2}&v_2&-\frac{dt}{2}&0&\cdots\\0&-\frac{dt}{2}&v_3&-\frac{dt}{2}&0&\cdots\\\vdots&0&\ddots&\ddots&\ddots\end{pmatrix}$$ To be even more precise: $$A=\mathbf1-\frac{dt}{2}(D2+V)\qquad b=\left(\mathbf1+\frac{dt}{2}(D2+V)\right)\psi$$ with $$D2=\begin{pmatrix}-2&1&0&\cdots\\1&-2&1&\ddots\\0&1&-2&\ddots\\\vdots&\ddots&\ddots&\ddots\end{pmatrix}\qquad V=\begin{pmatrix}V_1&0&\cdots\\0&V_2&\ddots\\\vdots&\ddots&\ddots\end{pmatrix}$$ If you're curious, I need this for solving the one-dimensional Schrödinger equation via Crank-Nicolson iteration.
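For concreteness, here is a minimal sketch (variable names are my own, and I take the definitions above literally, without the imaginary unit of the full Schrödinger equation) of how $A$ and $b$ are assembled for one step:

```python
import numpy as np

n = 8                              # grid size (toy value)
dt = 0.01                          # time step
V = np.linspace(0.0, 1.0, n)       # some potential sampled on the grid
psi = np.ones(n, dtype=complex)    # current wavefunction (placeholder)

# D2 + V as a dense tridiagonal matrix, following the definitions above.
D2V = (np.diag(np.full(n, -2.0) + V)
       + np.diag(np.ones(n - 1), -1)
       + np.diag(np.ones(n - 1), 1))

A = np.eye(n) - (dt / 2) * D2V           # diagonal entries are the v_i
b = (np.eye(n) + (dt / 2) * D2V) @ psi

x = np.linalg.solve(A, b)                # wavefunction at the next step
```

At $n=1000$ the dense solve above is exactly the $\mathcal O(n^3)$ cost in question; the point of the accepted answer is to exploit the tridiagonal structure instead.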

Does an efficient algorithm exist for solving such a system (or for inverting the matrix)? So far the best I have found is Gaussian elimination, which runs in $\mathcal O(n^3)$, but for $n=1000$ that would already mean on the order of a billion operations.


Accepted answer:

Just too long for a comment.

You don't need (and shouldn't) invert the matrix: just solve the linear system. For tridiagonal matrices you can use the Thomas algorithm, which is just Gaussian elimination specialized to a tridiagonal system and runs in $\mathcal O(n)$. It consists of two parts, a forward sweep and a back substitution, and is easy to implement. You can find all the relevant details on the Wikipedia page.
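A minimal sketch of the Thomas algorithm in Python (function and argument names are my own; no pivoting, which is fine for diagonally dominant systems like the one above):

```python
import numpy as np

def thomas_solve(lower, diag, upper, b):
    """Solve a tridiagonal system A x = b in O(n).

    lower: sub-diagonal   (length n-1)
    diag:  main diagonal  (length n)
    upper: super-diagonal (length n-1)
    b:     right-hand side (length n)
    """
    n = len(diag)
    # Work on copies so the caller's arrays are not overwritten.
    c = np.array(upper, dtype=float)
    d = np.array(diag, dtype=float)
    rhs = np.array(b, dtype=float)

    # Forward sweep: eliminate the sub-diagonal.
    for i in range(1, n):
        w = lower[i - 1] / d[i - 1]
        d[i] -= w * c[i - 1]
        rhs[i] -= w * rhs[i - 1]

    # Back substitution.
    x = np.empty(n)
    x[-1] = rhs[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (rhs[i] - c[i] * x[i + 1]) / d[i]
    return x
```

In practice you may not even need to write this yourself: SciPy's `scipy.linalg.solve_banded` (or `solveh_banded` for the symmetric case) wraps an equivalent banded LAPACK solver.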