In time complexity, is there a meaningful difference between O($2^n$) and O($2^{n/c}$)?


A table $T$ has $n$ rows and $n$ columns, and all of its contents are of the same data type.

An algorithm chooses a row $r$ and generates from it successive combinations $c$ of values until the row is no longer needed. For each combination $c$ in $r$, the algorithm returns either true or false in O($|c|^2$) time.

In general, for every $r$ there are $2^n$ combinations of 1 . . . $n$ values on which to operate. However, the algorithm will permanently discard the row if the $2^{\lceil n/2 \rceil\textrm{th}}$ $c$ in any $r$ returns false.

If I'm calculating this correctly, the time required to discard a single row is O($2^{n/2}$), and since there are $n$ rows, the time required to completely exhaust the table is O($n \cdot 2^{n/2}$).
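As a quick numeric sanity check of these counts (a sketch, not the actual algorithm), the snippet below compares the worst-case number of combinations per row, $2^n$, against the number checked before a row is discarded, $2^{\lceil n/2 \rceil}$, and the table-wide total $n \cdot 2^{\lceil n/2 \rceil}$:

```python
import math

def combos_full(n):
    """Worst-case combinations per row: 2^n."""
    return 2 ** n

def combos_until_discard(n):
    """Combinations checked before a row is discarded: 2^ceil(n/2)."""
    return 2 ** math.ceil(n / 2)

for n in (10, 20, 40):
    per_row = combos_until_discard(n)
    print(f"n={n}: full={combos_full(n)}, "
          f"until discard={per_row}, table total={n * per_row}")
```

Note that $2^{n/2} = (\sqrt{2})^n \approx 1.414^n$, so the discard path still grows exponentially in $n$, just with a smaller base.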

Is that math correct, and does dividing the $n$ in the exponent by a constant mean that the algorithm completes in polynomial time?