Averaging Datasets with Inconsistent Time Points

I am trying to find the equivalent of a "mean" and "standard deviation" of some time-dependent datasets, with the added complication that not all datasets were taken at consistent time points.

Say, for example, you set 3 separate ovens to 350 degrees and measure the temperature response.

[Figure: measured temperature vs. time for the three ovens as they heat toward 350 degrees]

All the ovens follow a similar temperature profile, but each oven was sampled at its own set of time points, and most of those points don't line up across ovens (e.g., only oven 1 has data at 6 min, only oven 2 at 5 min, and only oven 3 at 5.5 min). I want to know the "average" temperature profile as the ovens approach 350, and the spread in these profiles. Ideally, I would like to construct an average curve with time-dependent error bars. I'm not sure how to do this, but there must be some standard procedure.

I thought about picking some convenient time points (e.g. 1, 2, 3, ...), interpolating the curves that lack data at those times, and then averaging the interpolated values as usual, but that approach seems to lack rigor. Is there a better option?
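For concreteness, the interpolation idea described above can be sketched as follows. This is just one possible implementation of that idea, not an endorsement of it; the oven times and temperatures below are made-up placeholder values, and the sketch uses plain linear interpolation (`np.interp`) restricted to the time range all ovens share, so no curve is ever extrapolated.

```python
import numpy as np

# Hypothetical measurements: each oven sampled at its own times (min)
# paired with temperatures (degrees). Values are invented for illustration.
ovens = [
    (np.array([0.0, 1.0, 2.5, 4.0, 6.0, 8.0]),
     np.array([70.0, 150.0, 250.0, 310.0, 340.0, 350.0])),
    (np.array([0.0, 1.5, 3.0, 5.0, 7.0]),
     np.array([72.0, 180.0, 270.0, 330.0, 348.0])),
    (np.array([0.0, 2.0, 3.5, 5.5, 7.5]),
     np.array([68.0, 210.0, 290.0, 335.0, 349.0])),
]

# Common grid restricted to the overlap of all ovens' time ranges,
# so we only interpolate and never extrapolate.
t_lo = max(t[0] for t, _ in ovens)
t_hi = min(t[-1] for t, _ in ovens)
grid = np.linspace(t_lo, t_hi, 50)

# Linearly interpolate each oven's curve onto the shared grid.
curves = np.vstack([np.interp(grid, t, temp) for t, temp in ovens])

# Pointwise mean and sample standard deviation (ddof=1) across ovens;
# mean_curve +/- std_curve gives the time-dependent error bars.
mean_curve = curves.mean(axis=0)
std_curve = curves.std(axis=0, ddof=1)
```

One caveat worth flagging: linear interpolation treats each interpolated value as if it were a real measurement, so the resulting standard deviation understates the extra uncertainty introduced by the interpolation itself, which may be the "lack of rigor" concern.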

Sorry for the somewhat contrived example and the Excel graph, but I think it's representative of the kind of dataset I have in mind.