Optimizing power consumption at server farm


I have the following task: I need to optimize power consumption on a server farm whose servers must auto-scale based on historical usage. While thinking about how to frame the problem, calculus of variations and the shortest path problem came to mind. I want to describe the auto-scaling algorithm as a shortest path on the CPU-usage-vs-time plane, subject to constraints such as the time it takes to launch a new instance. Imagine the picture below shows CPU usage (x(t)) versus time.

[Figure: CPU usage x(t) plotted against time]
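To make the shortest-path analogy concrete, here is a rough sketch I put together (not my actual system; every number, name, and cost below is invented for illustration). It discretizes time and instance count, treats each (time step, instance count) pair as a node, and finds the min-cost path, with the launch delay crudely modeled as "at most one new instance per step":

```python
# Rough sketch of the shortest-path view, with invented numbers:
# discretize time and treat each (time step, instance count) pair as a node.

demand = [2.0, 3.5, 5.0, 4.0, 2.5]  # historical CPU demand per step (made up)
max_instances = 6                    # each instance supplies 1.0 CPU unit
power_cost = 1.0                     # cost of one instance running one step
shortfall_cost = 10.0                # penalty per unit of unmet demand

def step_cost(n, d):
    """Cost of running n instances against demand d for one time step."""
    return n * power_cost + shortfall_cost * max(0.0, d - n)

def cheapest_schedule(demand):
    """Dynamic program: min-cost path through (step, instance-count) states."""
    best = [step_cost(n, demand[0]) for n in range(max_instances + 1)]
    for d in demand[1:]:
        new = []
        for n in range(max_instances + 1):
            # launch-time constraint: at most one new instance per step;
            # scaling down is assumed to be instantaneous
            reachable = best[max(0, n - 1):]
            new.append(min(reachable) + step_cost(n, d))
        best = new
    return min(best)

print(cheapest_schedule(demand))  # → 19.0
```

In this toy version the optimizer ramps up ahead of the demand peak because it cannot jump capacity instantly, which is exactly the behavior I'd hope a continuous formulation would capture.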

I have a few questions :

Is calculus of variations suitable for dynamical systems such as this one? I can fit a decent-looking function to the data and get a good approximation, but is it a good idea to look at my problem from this angle? Is calculus of variations only applicable in physics and engineering?
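If it helps, the kind of functional I have in mind (my own made-up notation: d(t) is the historical demand curve, and the bound r crudely models the finite launch rate) would be something like

$$ J[x] \;=\; \int_0^T \Big( c_{\text{power}}\, x(t) \;+\; c_{\text{miss}}\,\max\big(0,\, d(t) - x(t)\big) \Big)\, dt, \qquad |\dot x(t)| \le r, $$

minimized over admissible capacity trajectories x(t). I realize the max term is not smooth and the constraint sits on the derivative rather than in the integrand, so maybe this is closer to an optimal control problem than to classical calculus of variations; that distinction is part of what I'm asking about.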

I am new to calculus of variations, so please be gentle :) I am seeking intuition on the subject rather than hardcore theory.