I am not sure how to interpret the formula given in my professor's slides.
We assume (for simplicity) that each operation takes the same amount of time to complete.
The time an algorithm takes on an input of size $N$ is then given as $T(N)$:
$$T(N) = |Ops| \times \frac{Time}{Op} $$
I am bad with math notation, and I don't understand what the equation represents. I recognize the operation symbols, but I'm not sure what variables like $Ops$ are supposed to mean. I assume both $Ops$ and $Op$ have something to do with operations, but I don't know what exactly. My guess is that $Ops$ refers to the total number of operations, but then why would it need to be an absolute value? How can an algorithm perform fewer than zero operations?
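For what it's worth, here is how I am currently reading the formula in code. This is just my guess: I'm assuming $|Ops|$ means the total operation count, and I made up a fixed per-operation time, since the slides only say every operation takes the same amount of time.

```python
# My current reading of T(N) = |Ops| * (Time / Op), where |Ops| is
# just "how many operations the algorithm performs" (my assumption).

TIME_PER_OP = 1e-9  # hypothetical constant: each operation takes 1 ns


def T(num_ops: int) -> float:
    """Total running time = number of operations * time per operation."""
    return num_ops * TIME_PER_OP


# Example: an algorithm that does N^2 operations on an input of size N.
N = 1000
print(T(N ** 2))  # total time for N^2 = 1,000,000 operations
```

Is this the right way to understand it, or does the notation mean something else?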
My question is: what does this formula represent?
