I want to calculate the kurtosis of a stream of real-time data (to test for Gaussian behavior) in such a way that I do not have to hold the entire sample set before I can compute it.
To be more precise, I am receiving real-time data every second, and after the 100th sample arrives I need to report the kurtosis of those 100 samples without having stored all of them. I know this is possible for the mean and variance, but is it possible for higher moments such as the kurtosis?
Use a ring buffer of $100$ elements and running sums of powers. After an initialization phase to fill the buffer, every new sample overwrites the oldest one: subtract the outgoing sample's contribution from the accumulators and add the new sample's contribution each time.
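A minimal sketch of this idea in Python (the class name and window size are illustrative): keep running sums $S_k=\sum x_i^k$ for $k=1,\dots,4$, update them in $O(1)$ per sample, and recover the central moments $m_2$ and $m_4$ from the power sums to get the kurtosis $m_4/m_2^2$.

```python
from collections import deque

class RollingKurtosis:
    """Sliding-window kurtosis from running power sums over a ring buffer."""

    def __init__(self, window=100):
        self.buf = deque(maxlen=window)          # ring buffer of raw samples
        self.s1 = self.s2 = self.s3 = self.s4 = 0.0  # running sums of x, x^2, x^3, x^4

    def push(self, x):
        if len(self.buf) == self.buf.maxlen:
            old = self.buf[0]                    # sample about to be evicted
            self.s1 -= old
            self.s2 -= old ** 2
            self.s3 -= old ** 3
            self.s4 -= old ** 4
        self.buf.append(x)                       # deque drops the oldest automatically
        self.s1 += x
        self.s2 += x ** 2
        self.s3 += x ** 3
        self.s4 += x ** 4

    def kurtosis(self):
        n = len(self.buf)
        if n < 2:
            return float("nan")
        m = self.s1 / n                          # sample mean
        m2 = self.s2 / n - m * m                 # second central moment
        # Expand sum((x - m)^4) in terms of the power sums:
        m4 = (self.s4 - 4 * m * self.s3 + 6 * m * m * self.s2 - 3 * n * m ** 4) / n
        if m2 == 0:
            return float("nan")
        return m4 / (m2 * m2)                    # ~3 for Gaussian data
```

For a Gaussian source the result should hover around $3$; for the fixed samples $1,2,3,4$ it gives $1.64$, and pushing a fifth sample slides the window to $2,3,4,5$, which has the same kurtosis because kurtosis is shift-invariant.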
When you compute these running sums (presumably in floating-point arithmetic), watch out for the accumulation of rounding errors: when you add a term and later subtract it, the cancellation will not be exact, so in the long run the accumulators drift.
A solution is to restart the accumulation from scratch every now and then.