I need to invert a matrix $A$ given its $QR$ decomposition. It's a numerical task.
The task says the inversion should be "as cheap as possible". But it does not look like I can do anything more efficient than computing $R^{-1}$ and multiplying it by $Q^{T}$ (if one needs an explicit matrix, not a product of two matrices). However, even if a product is acceptable (no multiplication needed), I'm still inefficient, because I don't know how to invert a triangular matrix faster than in cubic time.
Are there any tricky algorithms for this (I'm not asking about something like fast matrix multiplication; it's a simple homework problem), or does the task only sound clever, and all I have to do is invert a triangular matrix as taught in a linear algebra course, or use back substitution?
EDIT: I emphasise that the task is to invert a matrix, not to solve a linear system.
Any $N \times N$ triangular system can be solved in $\mathcal{O}(N^2)$.
For instance, if it is an upper triangular system, start from the last (the $N^{\text{th}}$) equation, which requires only one division to get the $N^{\text{th}}$ unknown. Once you have this, go to the previous ($(N-1)^{\text{th}}$) equation, which requires only one multiplication, one subtraction and one division to get the $(N-1)^{\text{th}}$ unknown. The $(N-2)^{\text{th}}$ equation requires $2$ multiplications, $2$ subtractions and $1$ division. In general, the $(N-k)^{\text{th}}$ equation requires $k$ multiplications, $k$ subtractions and $1$ division. Hence, the total cost is $$1+2+3+\cdots+(N-1) = \frac{N(N-1)}{2} \text{ multiplications}$$ $$1+2+3+\cdots+(N-1) = \frac{N(N-1)}{2} \text{ subtractions}$$ $$N \text{ divisions}$$
The same idea works for lower triangular systems as well (in which case you start from the first equation and proceed all the way down).
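The counting above translates directly into code. Here is a short NumPy sketch of back substitution for an upper triangular system (the function name and vectorized inner product are my choices; each row's dot product bundles the $k$ multiplications and subtractions of the count above):

```python
import numpy as np

def back_substitution(R, b):
    """Solve the upper triangular system R x = b in O(N^2) operations."""
    N = len(b)
    x = np.zeros(N)
    for i in range(N - 1, -1, -1):
        # R[i, i+1:] @ x[i+1:] performs the (N-1-i) multiplications and
        # subtractions for equation i; then one division by the pivot.
        x[i] = (b[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x
```

For a lower triangular system the loop simply runs forward, from the first equation to the last.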
In fact, the idea behind all matrix decomposition algorithms is to make the solve phase cheap: once you have the decomposition of the matrix, you can solve the linear system relatively inexpensively even if the right-hand side changes. The two main decompositions in use, $LU$ and $QR$, satisfy this requirement.
Other decompositions, like the SVD $A = U \Sigma V^T$, where $U$ and $V$ are orthogonal and $\Sigma$ is diagonal, also satisfy the requirement. Once we have $A = U \Sigma V^T$, solving $Ax = b$ amounts to solving $Uy = b$, whose solution is $y = U^Tb$ and costs $\mathcal{O}(N^2)$; then $\Sigma z = y$, which is trivial to invert since $\Sigma$ is diagonal and costs $\mathcal{O}(N)$; and finally $V^Tx = z$, whose solution is $x = Vz$ and costs $\mathcal{O}(N^2)$.
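The three-stage SVD solve can be sketched in a few lines of NumPy (the function name is mine; `U, s, Vt` follow the convention of `np.linalg.svd`):

```python
import numpy as np

def solve_via_svd(U, s, Vt, b):
    """Solve A x = b given the SVD A = U @ diag(s) @ Vt."""
    y = U.T @ b    # solve U y = b: U is orthogonal, so y = U^T b, O(N^2)
    z = y / s      # solve diag(s) z = y: elementwise division, O(N)
    x = Vt.T @ z   # solve Vt x = z: Vt is orthogonal, so x = V z, O(N^2)
    return x
```

Note that the $\mathcal{O}(N^2)$ cost applies only to the solve; computing the SVD itself is far more expensive than $LU$ or $QR$.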
EDIT: In case you want the inverse of the lower triangular matrix explicitly, you proceed as follows. (In numerical linear algebra, forming an inverse explicitly is considered one of the cardinal sins; in practice you essentially never need it.) Consider the unit lower triangular matrix $$L = \begin{pmatrix}1 & 0 & 0 & 0 & \cdots & 0 \\ l_{21} & 1 & 0 & 0 & \cdots & 0 \\ l_{31} & l_{32} & 1 & 0 & \cdots & 0 \\ l_{41} & l_{42} & l_{43} & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\ l_{n1} & l_{n2} & l_{n3} & l_{n4} & \cdots & 1 \\ \end{pmatrix}$$ We can write $L$ as $$L = L_1 L_2 L_3 \cdots L_{n-1}$$ where $L_k$ is the identity except for the entries of column $k$ below the diagonal: $$L_k = \begin{pmatrix}1 & \cdots & 0 & 0 & \cdots & 0 \\ \vdots & \ddots & \vdots & \vdots & & \vdots \\ 0 & \cdots & 1 & 0 & \cdots & 0 \\ 0 & \cdots & l_{k+1,k} & 1 & \cdots & 0 \\ \vdots & & \vdots & \vdots & \ddots & \vdots \\ 0 & \cdots & l_{n,k} & 0 & \cdots & 1 \end{pmatrix}$$ Each $L_k$ is inverted simply by negating its subdiagonal entries, so $$L^{-1} = L_{n-1}^{-1}L_{n-2}^{-1}L_{n-3}^{-1} \cdots L_{1}^{-1}$$ where $$L_k^{-1} = \begin{pmatrix}1 & \cdots & 0 & 0 & \cdots & 0 \\ \vdots & \ddots & \vdots & \vdots & & \vdots \\ 0 & \cdots & 1 & 0 & \cdots & 0 \\ 0 & \cdots & -l_{k+1,k} & 1 & \cdots & 0 \\ \vdots & & \vdots & \vdots & \ddots & \vdots \\ 0 & \cdots & -l_{n,k} & 0 & \cdots & 1 \end{pmatrix}$$
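To make the factorization concrete, here is a minimal NumPy sketch that assembles $L^{-1}$ from the factors $L_k^{-1}$ exactly as in the product above (the function name is mine; multiplying full $n \times n$ matrices like this is for illustration only and is not how one would implement it efficiently):

```python
import numpy as np

def unit_lower_inverse(L):
    """Invert a unit lower triangular matrix using its Gauss-transform
    factors: L = L_1 L_2 ... L_{n-1}, where each L_k^{-1} is obtained by
    negating the subdiagonal entries of column k, so
    L^{-1} = L_{n-1}^{-1} L_{n-2}^{-1} ... L_1^{-1}."""
    n = L.shape[0]
    inv = np.eye(n)
    # Accumulate the product left to right: first L_{n-1}^{-1}, last L_1^{-1}
    # (0-indexed, column k here is column k+1 in the math above).
    for k in range(n - 2, -1, -1):
        Lk_inv = np.eye(n)
        Lk_inv[k + 1:, k] = -L[k + 1:, k]  # negate subdiagonal entries
        inv = inv @ Lk_inv
    return inv
```

Note that while each $L_k^{-1}$ is just $L_k$ with negated entries, their product generally fills in the strictly lower triangle of $L^{-1}$, so $L^{-1}$ is not simply $-L$ off the diagonal.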