Calculating time complexity of algorithms written in pseudocode.


Nowadays we are interested in finding algorithms with a prescribed running time. For example, if for a certain decision problem $X$ there is an algorithm with running time $O(n^3)$, we try to improve on this bound. Generally, in the literature an algorithm is presented as pseudocode, followed by an informal analysis of the code that ends with the phrase "the running time is $O(f(n))$".

I am able to calculate the time complexity only for a Turing machine, and in general the time complexity depends heavily on the model of computation we are using. In fact, the extended Church-Turing thesis only guarantees a polynomial slowdown when changing models. Concretely, my question is the following:

Question: Given a piece of pseudocode, how is the time complexity of the algorithm calculated? More precisely, what are the elementary operations that define "one computational step"? For a Turing machine, for example, this is clear, since a computational step corresponds to one application of the transition function.
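For concreteness, here is the kind of analysis I mean, sketched in Python under the unit-cost RAM assumption (my own illustration, not from any particular source): each comparison, assignment, and arithmetic operation is counted as one step, and the total step count is what gets summarized as $O(f(n))$.

```python
def insertion_sort_with_count(a):
    """Sort the list a in place, counting 'elementary operations'.

    Step-counting convention (an assumption, not a standard):
    each assignment, arithmetic operation, and comparison is one step.
    """
    steps = 0
    for i in range(1, len(a)):
        key = a[i]
        steps += 1                 # one assignment
        j = i - 1
        steps += 1                 # one assignment
        while j >= 0 and a[j] > key:
            steps += 1             # one successful loop test
            a[j + 1] = a[j]
            steps += 1             # one assignment
            j -= 1
            steps += 1             # one decrement
        steps += 1                 # the final failed loop test
        a[j + 1] = key
        steps += 1                 # one assignment
    return steps

# On a reversed list of length n the inner loop runs i times for
# each i, so the count is a quadratic polynomial in n, matching
# the textbook O(n^2) bound.
```

Of course, a different convention (say, charging the cost of comparing $b$-bit integers as $b$ steps rather than $1$) changes the constants, and possibly the asymptotics, which is exactly the ambiguity I am asking about.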

Thanks in advance.