What is the complexity of this type of algorithm (loop-bounded)?


I have an algorithm that contains only instructions of the following types:

  • $X_i=X_j$
  • $X_i=X_i+1$
  • $\text{while }(X_i\le N)\text{ do }\{C\}$ where $C$ is another instruction

    $N$ is a global constant, and initially all variables are equal to $0$.

    I want to prove that if this algorithm terminates, then the number of executed affectations (assignments) is polynomial in $N$. I suspect this has already been done by someone else, so I am looking for references on this problem (or a similar one). If someone can provide a proof, or the idea behind one, it would be greatly appreciated.
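For concreteness, here is a hypothetical depth-2 program in this instruction set, simulated in Python (the variable names and the assignment counter are my own, not part of the question); it already exhibits the quadratic growth being asked about:

```python
def run_example(N):
    """Simulate the depth-2 program
       while (X1 <= N) { X2 = X3; while (X2 <= N) { X2 = X2 + 1 }; X1 = X1 + 1 }
    and count every executed assignment ("affectation")."""
    count = 0
    x1 = x2 = x3 = 0        # all variables start at 0; X3 is never written
    while x1 <= N:          # while (X1 <= N)
        x2 = x3; count += 1             # X2 = X3, i.e. reset X2 to 0
        while x2 <= N:                  # while (X2 <= N)
            x2 = x2 + 1; count += 1     # X2 = X2 + 1
        x1 = x1 + 1; count += 1         # X1 = X1 + 1
    return count
```

The outer loop runs $N+1$ times and each iteration executes $N+3$ assignments, so the total is $(N+1)(N+3)$, quadratic in $N$.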


Best answer:

Note that all numbers greater than $N$ are indistinguishable from one another as far as the program is concerned. Without changing the running time, we can therefore replace every instruction $X_i=X_i+1$ with $X_i=\min(X_i+1,\,N+1)$. This confines all variables to $\{0,1,2,\dots,N,N+1\}$.

Now build a graph of all possible states of the executing program. The vertices are tuples $(X_1, X_2, \dots, X_k, L)$, where each $X_i$ lies in $\{0,1,2,\dots,N,N+1\}$ and $L$ is a location in the program.

Execution is deterministic, so a run of the program traces a walk in this graph: either the walk revisits a vertex, in which case the program cycles forever, or it is a simple path, whose length is bounded by the number of vertices, $(N+2)^k \cdot |P| = O(N^k)$ for a program of fixed size $|P|$.
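This argument can be made concrete with a small interpreter. The instruction encoding below is my own assumption, not part of the answer: values are capped at $N+1$ as described, and a repeated variable state at the head of the same loop proves nontermination, since execution is deterministic.

```python
# Assumed encoding: ("copy", i, j) for X_i = X_j, ("inc", i) for
# X_i = X_i + 1, and ("while", i, body) for a loop guarded by X_i <= N.

class NonTermination(Exception):
    pass

def exec_block(block, X, N):
    """Run a list of instructions on the variable vector X.
    Returns the number of executed assignments; raises NonTermination
    if the same variable state recurs at the head of one loop, which,
    the language being deterministic, means an infinite loop."""
    count = 0
    for instr in block:
        if instr[0] == "copy":
            _, i, j = instr
            X[i] = X[j]
            count += 1
        elif instr[0] == "inc":
            _, i = instr
            X[i] = min(X[i] + 1, N + 1)   # all values > N behave alike
            count += 1
        else:                              # ("while", i, body)
            _, i, body = instr
            seen = set()
            while X[i] <= N:
                state = tuple(X)
                if state in seen:          # the walk closed a cycle
                    raise NonTermination
                seen.add(state)
                count += exec_block(body, X, N)
    return count
```

For example, `exec_block([("while", 0, [("inc", 0)])], [0], 2)` returns `3` (the loop `while (X1 <= N) X1 = X1+1` performs $N+1$ increments), while `[("while", 0, [("copy", 0, 1)])]` on `[0, 0]` never changes the state and raises `NonTermination` after one iteration.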

Another answer:

I will assume that by "affectations" you mean assignment instructions (the word is borrowed from French); I am not familiar with that usage in complexity theory. Also note that $N$ is your input, for this question to make sense, so it is slightly confusing to describe it as a global constant, even though the program treats it as one.

Proceed by induction on the maximum loop depth $d$, where a program with no while loops has depth $0$. Define $k$ to be the number of variables used anywhere in the program. To make the inductive hypothesis usable, strengthen it: assume the variables are initialized with arbitrary non-negative values, rather than making the strictly stronger assumption that they all start at zero.

For the base case $d=0$, the algorithm executes each instruction exactly once and so runs in constant time, which is of course $O(N^0)=O(1)$.

Suppose $d>0$, and assume that algorithms of this type with loop depth $d^\prime<d$ have complexity $O(N^{d^\prime})$. First consider the case where the algorithm consists of a single outermost loop. In each iteration of that loop, each of the constantly many top-level instructions of the loop body is executed exactly once (counting each inner loop as a single instruction). By the inductive hypothesis, each such execution takes time $O(N^{d-1})$, so one execution of the entire loop body takes a constant multiple of this, which is still $O(N^{d-1})$.

Now we need to bound the number of iterations of the outer loop. If some iteration modifies no variable, the next iteration is identical and the loop runs forever; since we assume the program halts, this cannot happen. There can be at most $k(N+1)$ incrementations before every used variable exceeds $N$ and the loop terminates, and each iteration performs at least one incrementation, so in the worst case there are $k(N+1)=O(N)$ iterations. Each iteration consists of a constant number of instructions, each taking $O(N^{d-1})$ time, so the algorithm runs in time $O(N \cdot N^{d-1})=O(N^d)$.
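The inductive bound can be checked numerically with a short recursion (my own construction, not part of the answer): a canonical depth-$d$ nest in which each loop first resets its variable by a copy and then iterates, counting every assignment executed.

```python
def nest_count(d, N):
    """Assignments executed by the canonical depth-d nest
       X = X_zero; while (X <= N) { <depth d-1 nest>; X = X + 1 }
    where a depth-0 nest is a single assignment X_i = X_j."""
    if d == 0:
        return 1               # one assignment
    count = 1                  # the reset X = X_zero
    x = 0
    while x <= N:
        count += nest_count(d - 1, N)   # one run of the loop body
        x += 1
        count += 1             # the increment X = X + 1
    return count
```

Unfolding the recursion gives `nest_count(1, N)` $= 2N+3$ and `nest_count(2, N)` $= 2N^2+6N+5$: each extra nesting level multiplies the count by roughly $N$, matching the $O(N^d)$ bound.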

Now generalize to an arbitrary program of loop depth $d$: it is a sequence of constantly many subalgorithms, each of complexity $O(N^d)$ by the above, so the whole program also runs in time $O(N^d)$. This completes the induction.

Finally, note that a running time of $O(N^d)$ is polynomial, because $d$ is a constant associated with the exact algorithm being considered.