I have a simulated time series of the energy production of a wind turbine in 10-minute intervals over 2 years (105,000 time steps), with values ranging over [0, 500]. I then included reductions by a certain factor to simulate an overloaded grid. These occur in 866 of the 105,000 time steps and are placed in the time series at random. Whether a lot of energy or only a little is reduced therefore depends on chance, so the total amount of reduced energy is not constant but differs from simulation to simulation. To present the results, I want to calculate a range in which the sum of the reduced energy will lie with a certain probability.
So far my approach has been to use the Central Limit Theorem and construct a normal distribution. The mean of the time series is 94 and its standard deviation is 96, which gives a normal distribution for the sum with mean 866 · 94 = 81404 and standard deviation √866 · 96 ≈ 2825. Does this normal distribution depict the situation appropriately, or are there other factors I have not yet taken into account?
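To make the CLT approach concrete, here is a minimal sketch that computes the normal parameters for the sum and a symmetric 95% range. The 95% level and the z-value are my additions for illustration; the per-step mean and standard deviation are the ones stated above.

```python
import math

# CLT approximation for the sum of n = 866 reduced-energy values.
# Per-step mean and stdev come from the time series; assuming the 866
# affected steps are (roughly) independent, the sum is approximately normal.
n = 866
mu_single = 94.0
sd_single = 96.0

mu_sum = n * mu_single             # 866 * 94 = 81404
sd_sum = math.sqrt(n) * sd_single  # sqrt(866) * 96, about 2825 (not n * 96)

# Symmetric 95% interval: mean +/- 1.96 * stdev.
z95 = 1.959963984540054
lo = mu_sum - z95 * sd_sum
hi = mu_sum + z95 * sd_sum
print(f"sum ~ N({mu_sum:.0f}, {sd_sum:.0f}^2); 95% range [{lo:.0f}, {hi:.0f}]")
```

Note that the standard deviation of the sum scales with √n, not with n; that is what makes the relative width of the interval shrink as more reduction events are summed.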
Edit:
The reductions happen at 3 different intensities: in 52% of the cases the power production is reduced to 0, in 30% of the cases the output is reduced to 30%, and in 18% of the cases to 60%. I left that out to make the problem easier to understand. In the exact problem there are also 60- and 120-minute reductions, but I think we should stick to the 10-minute reductions for now.
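The effect of this intensity mixture on the total can be explored with a small Monte Carlo sketch. The pre-reduction power draw below is an assumption (an exponential with mean 94, capped at 500, as a stand-in for sampling the real time series); only the mixture probabilities and the 866 events come from the question.

```python
import random

random.seed(0)

def simulate_total_reduction(n_events=866, n_sims=2000):
    """Monte Carlo sketch of the total reduced energy in one simulation.

    Assumption: the pre-reduction power in an affected 10-minute step is
    drawn from an exponential distribution with mean 94, capped at 500,
    as a placeholder for the real series.  Intensity mix as stated:
    52% reduced to 0 output (the full value is lost), 30% reduced to
    30% output (70% lost), 18% reduced to 60% output (40% lost).
    """
    totals = []
    for _ in range(n_sims):
        total = 0.0
        for _ in range(n_events):
            power = min(random.expovariate(1 / 94.0), 500.0)
            u = random.random()
            lost_fraction = 1.0 if u < 0.52 else (0.7 if u < 0.82 else 0.4)
            total += power * lost_fraction
        totals.append(total)
    totals.sort()
    # Empirical 2.5% and 97.5% quantiles give a 95% range.
    return totals[int(0.025 * n_sims)], totals[int(0.975 * n_sims)]

lo, hi = simulate_total_reduction()
print(f"95% of simulations put the total reduced energy in [{lo:.0f}, {hi:.0f}]")
```

One thing this makes visible: the expected lost fraction per event is 0.52 · 1 + 0.30 · 0.7 + 0.18 · 0.4 ≈ 0.80, so the mixture shifts the mean of the total well below 866 · 94, which a CLT model of the full values alone would miss.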
Whether a reduction occurs does not depend on the value of the time series.
> c1) Do you know the distribution of the values of the power in the ten minutes?
I don't know exactly what you mean by this. I have the exact time series, so I know every value and can compute the distribution, mean values, and variances exactly. From the time series:

- 15% of the values are < 17
- 25% are < 27
- 50% are < 66
- 75% are < 110
- 85% are < 165
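If it helps to sanity-check a model against the empirical shape, the quantiles above can be turned into a rough piecewise-linear inverse CDF for resampling. The anchors at (0, 0) and (1, 500), and the linear interpolation between knots, are assumptions beyond the stated numbers; the question only says the values range over [0, 500].

```python
import random

random.seed(1)

# Piecewise-linear inverse CDF built from the stated quantiles.
# The first and last knots are assumed tail anchors, not data.
knots = [(0.0, 0.0), (0.15, 17.0), (0.25, 27.0), (0.50, 66.0),
         (0.75, 110.0), (0.85, 165.0), (1.0, 500.0)]

def sample_power():
    """Draw one value by inverting the interpolated CDF at a uniform u."""
    u = random.random()
    for (p0, x0), (p1, x1) in zip(knots, knots[1:]):
        if u <= p1:
            return x0 + (x1 - x0) * (u - p0) / (p1 - p0)
    return knots[-1][1]

draws = [sample_power() for _ in range(100_000)]
draws.sort()
print("interpolated median:", draws[50_000])  # should land near the stated 66
```

As a rough consistency check, the mean of this interpolated distribution comes out in the neighborhood of the stated mean of 94, though the assumed upper tail (linear from 165 to 500) dominates that comparison.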