Imagine you have a device collecting data at not-quite-regular time intervals. For plotting on a graph or for further processing, you'll want to turn the samples into regular ones with constant time intervals:

The problem is not as easy as it may seem. You can't just average the point values in the neighbouring columns:

Imagine a situation like this:

Obviously, the real average is more influenced by the points closer to our sample time:

I've been thinking quite hard about this. Someone suggested linear interpolation, but as you can see in the images, plain linear interpolation isn't something we can apply directly to scattered points. It did give me this idea, though:

OK, I overcomplicated the image. What I mean is: I can draw a line through every pair of points on opposite sides of our sample time, and then average the intersections of those lines with the vertical line at that time (our offset Y axis). Let's try it on the image we used for plain averaging:
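To make the idea concrete, here's a sketch of what I mean in code (the function name and the point format are just my own choices):

```python
def resample_at(t, left_pts, right_pts):
    """Average the intersections of the vertical line x = t with every
    line drawn through one point left of t and one point right of t.

    left_pts, right_pts: lists of (x, y) tuples on either side of t.
    """
    ys = []
    for xl, yl in left_pts:
        for xr, yr in right_pts:
            # line through (xl, yl) and (xr, yr), evaluated at x = t
            ys.append(yl + (t - xl) * (yr - yl) / (xr - xl))
    return sum(ys) / len(ys)
```

For a sanity check: if all points lie on y = x, every pairwise interpolation at t = 2 should give exactly 2, and so should their average.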

Yes, I think this looks much more correct. But common sense doesn't always work with math, which is why I'm posting here.
- Is my idea correct? Can I improve it? Did I take the wrong path?
- If it actually is right, I'm still missing an approach for generating the average when there are points on only one side.
- How could I go about calculating these? I think there must be a way without actually computing line intersections - the x coordinate of every intersection is fixed at the sample time...
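(What I'm imagining for that last point: since x is fixed at the sample time t, each "intersection" should reduce to just evaluating the line equation at t, with no geometric intersection routine at all. A sketch, with a function name of my own invention:)

```python
def intersect_at(t, p1, p2):
    """Intersection of the vertical line x = t with the line through
    p1 and p2: because x is fixed at t, this is plain linear
    interpolation/extrapolation, no line-line intersection needed."""
    (x1, y1), (x2, y2) = p1, p2
    return y1 + (t - x1) * (y2 - y1) / (x2 - x1)
```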
I've been toying with different implementations of weighted averages using (what would in your case be) the difference between your discrete target time and your sample times. I'm still working on finding a good solution, but here are some papers that might help:
Comparison of correlation analysis techniques for irregularly sampled time series http://www.nonlin-processes-geophys.net/18/389/2011/npg-18-389-2011.pdf
A Framework for the Analysis of Unevenly Spaced Time Series Data http://www.eckner.com/papers/unevenly_spaced_time_series_analysis.pdf
A Note on Trend and Seasonality Estimation for Unevenly-Spaced Time Series http://www.eckner.com/papers/trend_and_seasonality.pdf