I have a computer program that performs a task on a set of files. The time the program takes to finish is a function of how many files it is processing and the sum of their sizes, for example:
programTime(12 files, 24389 bytes) = 12 seconds
programTime(1 file, 24389 bytes) = 2 seconds
What is a good mathematical technique I can use to predict the processing time for any combination of file count and total size?
Points to consider:
- I know I will have to deal with error margins, as in any extrapolation method
- I can record the actual processing time of every program execution in order to improve later extrapolations (e.g. add a correction factor to my function)
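To make the question concrete, here is a minimal sketch of the kind of model I have in mind: a linear fit, time ≈ a·files + b·bytes + c, estimated by least squares from recorded runs. The coefficient names and the `predict` helper are just illustrative, and the only data points are the two from above (a real history would have many more rows, which is what makes the fit meaningful):

```python
import numpy as np

# Recorded runs: (file_count, total_bytes, seconds).
# Only the two observations from the question so far; in practice
# a row would be appended after every program execution.
runs = np.array([
    [12, 24389, 12.0],
    [ 1, 24389,  2.0],
])

# Fit time ~ a*files + b*bytes + c by ordinary least squares.
X = np.column_stack([runs[:, 0], runs[:, 1], np.ones(len(runs))])
y = runs[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b, c = coef

def predict(files, total_bytes):
    """Predicted processing time in seconds (hypothetical helper)."""
    return a * files + b * total_bytes + c
```

Refitting on the growing history after each run would act as the self-correcting factor mentioned above. Is plain linear regression like this the right tool, or is there a better-suited technique?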