Been struggling to solve this question. From my notes, you can calculate MIPS through this formula:
MIPS = Instruction Count / (Execution Time × 10^6)
And the question goes like this:
Given that a computer's average instruction execution time is 20 nanoseconds, what is the performance of this computer in MIPS?
Choices are:
a.5 b.10 c.20 d.50
and the answer is 50.
I really don't know how they managed to get the answer of 50. Please enlighten me and, if possible, show a solution so I can learn and practice.
If one instruction takes $20\,\mathrm{ns}=20\cdot 10^{-9}\,\mathrm{s}$, then how many instructions can the computer execute in one second? Precisely $\frac{1}{20\cdot 10^{-9}}$, agreed?
Divide that by $10^6$ (because we're after mega-instructions per second) to get the answer:
$$\mathrm{MIPS}=\frac{1}{20\cdot 10^{-9}\cdot 10^{6}}=\frac{1}{20\cdot 10^{-3}}=\frac{1000}{20}=50$$
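If it helps to double-check, the same arithmetic can be sketched in a few lines of Python (the variable names are mine, not from the question):

```python
# One instruction takes 20 ns = 20e-9 s.
instruction_time_s = 20e-9

# Instructions executed per second is the reciprocal of the per-instruction time.
instructions_per_second = 1 / instruction_time_s

# Divide by 10^6 to express the rate in mega-instructions per second (MIPS).
mips = instructions_per_second / 1e6
print(mips)  # 50.0
```

This matches answer choice d.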