Operation cost of solving a lower triangular system


I'm having a hard time understanding the cost of the operations. Take this for example:

Consider the system $Tx=b$, where $T=(t_{ij})$ is a lower triangular matrix.

\begin{cases} t_{11}x_1=b_1 \\ t_{21}x_1+t_{22}x_2 =b_2 \\ \quad\vdots \\ t_{n1}x_1+t_{n2}x_2+\dots+t_{nn}x_n =b_n \end{cases} $\Leftrightarrow$ $\sum_{j=1}^{i} t_{ij}x_j=b_i,\quad i=1(1)n$

The solution is obtained by forward substitution:

$x_i=(b_i-\sum_{j=1}^{i-1} t_{ij}x_j)/t_{ii},i=1(1)n$

The calculation of $x_i$ requires $i-1$ products and $1$ division.
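Concretely, the substitution formula can be sketched in Python (a minimal illustration with plain lists; the function name and the example system are my own, not from any particular library):

```python
# A minimal sketch of forward substitution for T x = b,
# with T lower triangular, using plain Python lists.

def forward_substitution(T, b):
    """Solve T x = b where T is lower triangular (list of lists)."""
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        # x_i = (b_i - sum_{j<i} t_ij x_j) / t_ii
        s = sum(T[i][j] * x[j] for j in range(i))
        x[i] = (b[i] - s) / T[i][i]
    return x

# Example: a 3x3 lower triangular system.
T = [[2.0, 0.0, 0.0],
     [1.0, 3.0, 0.0],
     [4.0, 1.0, 5.0]]
b = [2.0, 7.0, 11.0]
print(forward_substitution(T, b))  # → [1.0, 2.0, 1.0]
```

Row $i$ of the loop performs exactly the $i-1$ products and $1$ division counted above.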

But from here how do I get to know that the calculation of $x$ costs $\mathcal{O}( \frac{n^2}{2})$ operations?

3 Answers

Best answer

You then have $$\sum_{i=1}^n (i-1)=\frac{n(n-1)}{2} $$ products and $n$ divisions, so $$ \frac{n^2}{2}-\frac{n}{2} + n =\frac{n^2}{2}+\frac{n}{2}=O\bigg( \frac{n^2}{2}\bigg)$$ operations.
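As a sanity check, one can instrument the substitution loop and tally the operations actually performed (a sketch of my own; the helper name and the random test system are illustrative assumptions):

```python
# Run forward substitution on a random lower triangular system and
# count the products and divisions performed, confirming
# n(n-1)/2 products and n divisions.
import random

def solve_and_count(T, b):
    n = len(b)
    x = [0.0] * n
    products = divisions = 0
    for i in range(n):
        s = 0.0
        for j in range(i):
            s += T[i][j] * x[j]   # one product (and one addition)
            products += 1
        x[i] = (b[i] - s) / T[i][i]  # one division per row
        divisions += 1
    return x, products, divisions

n = 10
T = [[random.uniform(1, 2) if j <= i else 0.0 for j in range(n)]
     for i in range(n)]
b = [random.uniform(0, 1) for _ in range(n)]
_, products, divisions = solve_and_count(T, b)
assert products == n * (n - 1) // 2   # 45 for n = 10
assert divisions == n                 # 10
```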


Well, from there it is quite straightforward: you are solving for $n$ variables, and each one takes one more operation than the last (row $i$ needs $i-1$ products and one division). So the total number of products and divisions is

$$1 + 2 + 3 + \dots + n = \sum_{i=1}^{n} (i - 1) + n = \frac{n(n - 1)}{2} + n,$$ which is $$O(n^2).$$

When you write $O$ notation, you drop the constants of proportionality.


Netchaiev gave you a straightforward answer. Here is a second way to obtain the number of operations.

This way of thinking can be helpful for computer-science-oriented people, who see algorithms as computer code rather than as mathematics. It is also useful when you have written code to analyse.

The pseudo-code for this algorithm is

for i=1:n
     sum = 0
     for j=1:i-1
         temp = t_ij * x_j
         sum = sum + temp
     x_i = (b_i - sum)/t_ii

Now the rule is quite easy: $\text{for} = \sum$

$$\sum_{i=1}^n\Big\{ \sum_{j=1}^{i-1} [1\odot + 1\oplus] + 1\ominus + 1\oslash\Big\} $$ Then the number of operations is: $$\sum_{i=1}^n\{ 2(i-1) + 2\} = \sum_{i=1}^n 2i = 2\,\frac{n(n+1)}{2} = n^2 + n$$

Small remark: in this calculation I counted addition and multiplication separately. In maths you will often see one addition and one multiplication (as in $x=a+bc$) grouped into a single "arithmetic operation"; the term "elementary operation" for addition, multiplication, division etc. also appears in the literature. In hardware, this grouping of a multiplication and an addition is a fused multiply-add (FMA).

Then again, if you count this fused operation as one, you should also take into account that a division takes much longer than an addition or a multiplication (roughly 4 times longer).

So when you find an operation count for an algorithm, always check what exactly the author is counting.