I am trying to generalize the time complexity of a given algorithm using mathematical notation. The idea is that the running time of an algorithm is the sum of the running times of its statements.
Let $n$ be the input size.
Let $T(n)$ be the running time of the algorithm.
Let $S$ be the number of statements in the algorithm, indexed by $s = 1, \dots, S$.
Let $c_s$ be the cost of statement $s$, measured in basic steps.
Let $t_s$ be the number of times statement $s$ executes; this count may depend on $n$.
$T(n) = \sum_{s=1}^{S} c_s \cdot t_s$
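For instance, with $S = 2$ statements the sum simply expands to $T(n) = c_1 \cdot t_1 + c_2 \cdot t_2$.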
In order to calculate $c_s$ or $t_s$, one must know the details of the algorithm in question.
For example, let's say we have an algorithm with the following statements:
- A programmatic flag set to true (this happens only once, at the beginning of the algorithm)
- A for or while loop
We could say that $t_1$ is constant and does not scale with the input size. Moving on to statement 2 (the for loop), $t_2$ does scale with the input size, because the loop will execute $n$ times, so $t_2 = n$.
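Below is a minimal sketch of such an algorithm in Python. The function name, the `items` parameter, and the loop body are hypothetical placeholders, just to make the two statements concrete; the comments annotate each statement with its cost $c_s$ and execution count $t_s$:

```python
def example_algorithm(items):
    flag = True          # statement 1: cost c_1, executes t_1 = 1 time
    for item in items:   # statement 2: cost c_2, executes t_2 = n times (n = len(items))
        flag = not flag  # placeholder body, just to give the loop something to do
    return flag
```

Plugging $t_1 = 1$ and $t_2 = n$ into the sum gives $T(n) = c_1 \cdot 1 + c_2 \cdot n$, which grows linearly in $n$.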
How do we capture this relationship between $t_s$ and $n$ in mathematical notation without knowing the algorithm in question? Would $T(n) = \sum_{s=1}^{S} c_s \cdot t_s$ be acceptable to capture that relationship?