Alternative to averages when comparing variables with different frequencies


I have a table that looks like this:

| Date         | Day       | View % |
|--------------|-----------|--------|
| 1 June 2020  | Monday    | 18%    |
| 8 June 2020  | Monday    | 20%    |
| 16 June 2020 | Tuesday   | 15%    |
| 22 June 2020 | Monday    | 22%    |
| 24 June 2020 | Wednesday | 30%    |
| 1 July 2020  | Wednesday | 22%    |

Of course, the original data is much bigger, but I hope this illustrates the idea.

View % is the percentage of people who watched a television show on the respective date; each row in the table corresponds to a different date. Let's assume the total audience size is 100, and that all other factors (seasonality, events, etc.) are constant.

On a particular Monday, 18%, or 18 people, watched the show. The next Monday (a different date), 20%, or 20 people, watched the show, and so on.

I need to find which day of the week is best for the show in terms of view %. Taking simple averages doesn't seem right, as I'd be comparing the average of 3 Mondays against that of 2 Wednesdays and just 1 Tuesday.
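For concreteness, here is a minimal pandas sketch of the simple-average comparison described above, using the sample table (column names are my own choice, not fixed by the data):

```python
import pandas as pd

# Reconstruction of the sample table above
df = pd.DataFrame({
    "date": ["2020-06-01", "2020-06-08", "2020-06-16",
             "2020-06-22", "2020-06-24", "2020-07-01"],
    "day": ["Monday", "Monday", "Tuesday",
            "Monday", "Wednesday", "Wednesday"],
    "view_pct": [18, 20, 15, 22, 30, 22],
})

# Simple per-day averages, alongside the number of
# observations each average is based on
summary = df.groupby("day")["view_pct"].agg(["mean", "count"])
print(summary)
# Monday's mean rests on 3 observations, Tuesday's on only 1,
# which is what makes the direct comparison feel unfair.
```

The `count` column makes the imbalance explicit: the day means are estimated from very different sample sizes, so their reliability differs.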

What should be the correct approach here?