How to formalize (in simple terms) when average of ratios would be greater than ratio of averages?
I found that sometimes the former is greater than the latter and sometimes the latter is greater than the former.
I want to understand in what cases one would be greater than the other.
What happens asymptotically?
And what happens in day to day calculations? Which should I expect to be greater?
Let us start the conversation with just 4 numbers for simplicity.
Note that neither the Average of Ratios $$\frac{\frac{a}{b}+\frac{c}{d}}2 \tag{AR}$$ nor the Ratio of Averages $$\frac{\frac{a+c}2}{\frac{b+d}2}=\frac{a+c}{b+d} \tag{RA}$$ changes when all 4 numbers are scaled together, so the relationship between AR and RA depends only on the projective point $a:b:c:d$.
Now, let us scale $(a,b)$ to $(\lambda a,\lambda b)$: AR does not change, while RA tends to $\frac{a}{b}$ as $\lambda\to\infty$ and to $\frac{c}{d}$ as $\lambda\to 0$.
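As a quick numerical check of this limiting behavior (the four numbers here are arbitrary illustrative values, not from the question):

```python
a, b, c, d = 3, 5, 7, 2            # a/b = 0.6, c/d = 3.5

def RA(lam):
    """Ratio of averages after scaling (a, b) by lam; AR is unchanged by this scaling."""
    return (lam * a + c) / (lam * b + d)

print(RA(10**6))    # close to a/b = 0.6
print(RA(10**-6))   # close to c/d = 3.5
```

So by choosing $\lambda$ one can slide RA anywhere between the two ratios, while AR stays put at their midpoint.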
This should give you the intuition you are asking for: the ratio of averages is the average of the ratios weighted by their denominators, $$ \frac{a+c}{b+d} = \frac{b}{b+d}\times\frac{a}{b} + \frac{d}{b+d}\times\frac{c}{d},$$ while AR is their average with equal weights $\frac12$, and this generalizes directly to an arbitrary number of ratios.
Thus RA will be bigger when the bigger ratio has the bigger denominator (since that ratio then gets more than its equal share of the weight), and AR will be bigger when the bigger ratio has the smaller denominator.
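The weighted-average identity and the resulting comparison can be verified exactly with rational arithmetic (the numbers below are hypothetical, chosen so that the bigger ratio has the bigger denominator):

```python
from fractions import Fraction

def avg_of_ratios(a, b, c, d):
    """AR: the plain mean of a/b and c/d."""
    return (Fraction(a, b) + Fraction(c, d)) / 2

def ratio_of_avgs(a, b, c, d):
    """RA: (a+c)/(b+d), the 2's in the averages cancel."""
    return Fraction(a + c, b + d)

a, b, c, d = 12, 4, 1, 1           # a/b = 3 is the bigger ratio; b = 4 > d = 1
AR = avg_of_ratios(a, b, c, d)     # (3 + 1)/2 = 2
RA = ratio_of_avgs(a, b, c, d)     # 13/5

# RA is exactly the denominator-weighted average of the two ratios:
assert RA == Fraction(b, b + d) * Fraction(a, b) + Fraction(d, b + d) * Fraction(c, d)

# The bigger ratio carries the bigger weight, so RA > AR here:
assert RA > AR
```

Swapping the denominators (e.g. $b=1$, $d=4$ with the same ratios) flips the weights and makes AR the larger of the two.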