I want to answer the question: on average, how many sticks of butter do I get per dollar?
Given the following:
$$ \begin{array}{l|ccc} & \text{Dollars} & \text{Butter} & \text{Butter/Dollars} \\ \hline & 1 & 2 & 2 \\ & 1 & 2 & 2 \\ & 1 & 2 & 2 \\ & 2 & 2 & 1 \\ & 2 & 2 & 1 \\ & 2 & 3 & 1.5 \\ & 2 & 3 & 1.5 \\ & 1 & 3 & 3 \\ \hline \text{Sum} & 12 & 19 & 14 \\ \text{Avg} & 1.5 & 2.375 & 1.75 \end{array} $$
$\frac{\text{Avg(Butter)}}{\text{Avg(Dollars)}} = \frac{\text{Sum(Butter)}}{\text{Sum(Dollars)}} = \frac{2.375}{1.5} \approx 1.58$ is my answer, I believe.
However, I have no intuition as to what $\text{Avg(Butter/Dollars)} = 1.75$ represents, or why it's incorrect. Can anyone explain?
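For concreteness, both quantities can be computed from the table above with a short Python sketch (the data is hard-coded from the rows; variable names are just illustrative):

```python
# Rows from the table: (dollars, butter) for each purchase.
purchases = [(1, 2), (1, 2), (1, 2), (2, 2), (2, 2), (2, 3), (2, 3), (1, 3)]

dollars = [d for d, b in purchases]
butter = [b for d, b in purchases]
ratios = [b / d for d, b in purchases]

n = len(purchases)
# Ratio of sums (equivalently, ratio of averages): total butter per total dollar.
ratio_of_averages = sum(butter) / sum(dollars)  # 19 / 12 ≈ 1.583
# Average of the per-purchase ratios: each purchase weighted equally.
average_of_ratios = sum(ratios) / n             # 14 / 8 = 1.75

print(ratio_of_averages, average_of_ratios)
```

The two numbers disagree, which is exactly the puzzle in the question.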
The table just computes the average of the third column, $\frac{14}{8}=1.75$.
It is simply not the case that the average (butter per dollar) equals average(butter)/average(dollars), at least not in general.
Consider the following:
$$ \begin{array}{c|lcr} n & \text{Dollars} & \text{Butter} & \text{Butter Per Dollar} \\ \hline 1 & 1 & 1 & 1 \\ 2 & 2 & 2 & 1 \end{array} $$
In this instance the two do in fact coincide! Both are $1$.
But now suppose we permute the Butter column to get:
$$ \begin{array}{c|lcr} n & \text{Dollars} & \text{Butter} & \text{Butter Per Dollar} \\ \hline 1 & 1 & 2 & 2 \\ 2 & 2 & 1 & \frac 12 \end{array} $$
The ratio of the two averages is still $1$, of course; we haven't changed either average. But the average (butter per dollar) is now $\frac{2.5}{2}=1.25$. The intuition: Sum(Butter)/Sum(Dollars) weights each purchase by the dollars spent on it, while Avg(Butter/Dollars) weights each purchase equally regardless of its cost. The two agree only in special cases, such as when every purchase costs the same number of dollars.
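The two-row example above can be checked numerically. Here is a minimal sketch (function names are my own, not from the question):

```python
# Two purchases as (dollars, butter); permuting the butter column
# changes which butter amount goes with which dollar amount.
original = [(1, 1), (2, 2)]
permuted = [(1, 2), (2, 1)]

def ratio_of_averages(rows):
    # Sum(butter) / Sum(dollars): weights each purchase by its dollars.
    return sum(b for _, b in rows) / sum(d for d, _ in rows)

def average_of_ratios(rows):
    # Mean of per-purchase butter/dollar: weights each purchase equally.
    return sum(b / d for d, b in rows) / len(rows)

print(ratio_of_averages(original), average_of_ratios(original))  # 1.0 1.0
print(ratio_of_averages(permuted), average_of_ratios(permuted))  # 1.0 1.25
```

The permutation leaves the ratio of averages at $1$ but moves the average of ratios to $1.25$, matching the arrays above.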