I have a series of percentages:
132/220 (60%) and 88/220 (40%).
Now when I break them down into subcategories and then recalculate the percentages, they come out about 5% different.
81/140 (58%) and 59/140 (42%) (percentages ROUNDED).
21/40 (52.5%) and 19/40 (47.5%).
14/20 (70%) and 6/20 (30%).
16/20 (80%) and 4/20 (20%).
Then when I do the averages I get:
65.125% and 34.875%
I have no idea what is happening???
This is "normal." Suppose you have $2$ big exams, each out of $100$, and you get $40\%$ in each. Average: $40\%$.
You also have $2$ little assignments, each out of $25$, and you get a perfect score in each. Average: $100\%$.
The average of the two averages is $70\%$.
However, the total mark, out of $250$, was $130$. Average: $52\%$.
Your example is less extreme, but the same phenomenon is involved.
For a correct computation, you should take a weighted average of the averages.
So the correct computation would be $(40\%)\frac{200}{250}+(100\%)\frac{50}{250}=52\%$, matching the pooled total.
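To see this on your own numbers, here is a small sketch (variable names are mine) comparing the plain mean of the subcategory percentages with the weighted mean, where each percentage is weighted by its subcategory size. The weighted mean is algebraically the same as pooling all the counts first:

```python
# Subcategory counts from the question: successes and totals.
successes = [81, 21, 14, 16]
totals    = [140, 40, 20, 20]

rates = [s / t for s, t in zip(successes, totals)]

# Naive (unweighted) mean of the subcategory rates:
naive = sum(rates) / len(rates)

# Weighted mean: weight each rate by its subcategory size.
weighted = sum(r * t for r, t in zip(rates, totals)) / sum(totals)

# Pooling all counts first gives the same answer:
pooled = sum(successes) / sum(totals)

print(f"naive mean:    {naive:.3%}")     # ~65.1% (your 65.125% used rounded rates)
print(f"weighted mean: {weighted:.0%}")  # 60%
print(f"pooled:        {pooled:.0%}")    # 60%
```

The naive mean treats a subcategory of $20$ as if it counted as much as one of $140$, which is exactly where your $5\%$ discrepancy comes from.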