Correct way to calculate mean values for timed data


Suppose I have a sensor in the road that records the date and time of every vehicle that passes, along with its current speed. These sampling points are more frequent in rush hour and far less frequent during the night. Traffic lights can also make cars pass more frequently one minute and far less frequently the next.

I need to calculate the number of cars that passed and their average speed over 30-minute periods, e.g. in rush hour from 16:30 to 17:00, 17:00 to 17:30, and so on. But what is the correct way to do this when displaying the result in a graph or chart?

Take the 17:00 to 17:30 window and say we have 1152 samples. Do I just calculate the mean speed (add all speeds and divide by 1152)? And what time should I plot this mean value at in the chart? Do I have to compute the average time of the samples (add all the timestamps and divide by 1152) and use that as the time of the mean value?

If I want the mean to land exactly at 17:00, do I take the values from 16:45 to 17:15, or do I need to find an equal number of samples before and after 17:00, or what is the correct way to calculate the mean for a specific point in time?

I hope you can understand my issue (there is probably a proper name or term for this), and that someone can direct me to some papers about this type of calculation, or just give me an easy-to-understand answer here (even though I suspect there is no easy way to do this).
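To make the question concrete, here is a minimal sketch (in Python, with made-up sample data) of the "fixed bins" approach I am considering: floor each timestamp to the start of its 30-minute bin, average the speeds in each bin, and plot the point at the bin centre. The data and the `bin_start` helper are hypothetical, just for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical samples: (timestamp, speed in km/h) pairs from the road sensor.
samples = [
    (datetime(2024, 1, 1, 16, 31), 42.0),
    (datetime(2024, 1, 1, 16, 55), 38.5),
    (datetime(2024, 1, 1, 17, 2), 51.0),
    (datetime(2024, 1, 1, 17, 20), 47.5),
    (datetime(2024, 1, 1, 17, 29), 44.0),
]

BIN = timedelta(minutes=30)

def bin_start(ts, width=BIN):
    """Floor a timestamp to the start of its fixed 30-minute bin."""
    midnight = ts.replace(hour=0, minute=0, second=0, microsecond=0)
    n = int((ts - midnight) / width)
    return midnight + n * width

# Group the speed readings by bin.
bins = {}
for ts, speed in samples:
    bins.setdefault(bin_start(ts), []).append(speed)

for start in sorted(bins):
    speeds = bins[start]
    count = len(speeds)               # vehicles that passed in this half hour
    mean_speed = sum(speeds) / count  # simple arithmetic mean of the speeds
    center = start + BIN / 2          # one option: plot the point at the bin centre
    print(f"{start:%H:%M}-{(start + BIN):%H:%M}  n={count}  "
          f"mean={mean_speed:.1f}  x={center:%H:%M}")
```

My uncertainty is exactly about the `center` line: whether the x-coordinate should be the bin centre, the bin start, or the mean of the sample timestamps in the bin; and whether, for a point exactly at 17:00, I should instead average over a window centred there (16:45 to 17:15).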