Suppose you are given the task of improving the performance of a program consisting of three parts. Part $A$ requires 20% of the overall run time, part $B$ requires 30%, and part $C$ requires 50%.
You determine that for \$1000 you could either speed up part $B$ by a factor of 3.0 or part $C$ by a factor of 1.5. Which choice would maximize performance?
I think if you speed up part $B$ by a factor of 3.0, then you only spend 10% of the time on that part, saving 20% of the time. But if you speed up part $C$ by a factor of 1.5, then you spend 25% of the time and save 25% of the time.
So is speeding up part $C$ the right answer, since you save 25% vs. 20%?
Thanks
Let $x$ be the runtime of your program.
Then
\begin{align} t(A) :&= a = 0.2 x\\ t(B) :&= b = 0.3 x\\ t(C) :&= c = 0.5 x\\ \end{align}
To speed something up by a factor of 2 (twice as fast) means reducing its runtime to half of what it was before. The idea is that you can then do twice as much work in the same time.
Similarly, I think "speed up part $B$ by a factor of 3.0" means reducing its runtime to $\frac{1}{3}$ of the original. A speedup by a factor of 1.5 means the new runtime is $\frac{1}{1.5} = \frac{2}{3}$ of the original runtime.
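In general (the symbols $t_{\text{old}}$, $t_{\text{new}}$, $k$ are my notation, not from the question): speeding a part up by a factor $k$ divides its runtime by $k$:

$$t_{\text{new}} = \frac{t_{\text{old}}}{k}, \qquad \text{so } k = 1.5 \implies t_{\text{new}} = \frac{1}{1.5}\, t_{\text{old}} = \frac{2}{3}\, t_{\text{old}}.$$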
So the question is
$$a + \frac{1}{3}b + c < a + b + \frac{2}{3}c$$
or
$$a + \frac{1}{3}b + c > a + b + \frac{2}{3}c$$
Now resubstitute:
\begin{align} a + \frac{1}{3}b + c &= 0.2x + \frac{1}{3} \cdot 0.3x + 0.5x\\ &= 0.7x + 0.1x\\ &= \frac{4}{5}x\\ &< \frac{5}{6}x\\ &= 0.5x + \frac{1}{3}x\\ &= 0.2x + 0.3x + \frac{2}{3} \cdot 0.5x\\ &= a + b + \frac{2}{3}c \end{align}
So the answer is: reduce the execution time of (i.e. speed up) part $B$.
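To double-check, here is a quick Python sketch of the comparison (the variable names are mine, not from the question):

```python
# Fractions of the original runtime taken by each part.
a, b, c = 0.2, 0.3, 0.5

# Speeding a part up by a factor k divides its runtime by k.
runtime_speed_up_B = a + b / 3.0 + c   # speed up part B by 3.0
runtime_speed_up_C = a + b + c / 1.5   # speed up part C by 1.5

print(runtime_speed_up_B)  # about 0.8  (4/5 of the original time)
print(runtime_speed_up_C)  # about 0.833 (5/6 of the original time)
print(runtime_speed_up_B < runtime_speed_up_C)  # speeding up B wins
```

So the $1000 spent on part $B$ leaves the program at $\frac{4}{5}$ of its original runtime, versus $\frac{5}{6}$ for part $C$.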