Assume a grocery company grows from $180$ to $210$ shops and at the same time experiences a $7\,\%$ drop in sales. By how much would their sales have dropped if they had not opened the extra stores? I can see that to keep per-store sales constant, total sales should have grown by $\frac{210-180}{180} \approx 16.67\,\%$, but I can't figure out how to get the final result.
Thanks.
p.s. Couldn't find an appropriate tag.
The only thing that makes sense is to assume all stores have the same sales, and that if they had not opened the extra stores, each remaining store would still have sold that same amount. If each store sold one unit last year, the company sold $180$ units in total. This year they sold $0.93 \cdot 180$ units from $210$ stores, so each store sold $\frac{0.93 \cdot 180}{210}$ units, and the total with only $180$ stores would have been $\frac{0.93 \cdot 180}{210} \cdot 180 \approx 143.49$ units. That is a drop of $1 - \frac{0.93 \cdot 180}{210} \approx 20.29\,\%$ relative to last year's $180$ units.
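The arithmetic above can be sketched numerically (a minimal check, assuming the normalization of one unit per store last year):

```python
# Counterfactual sales drop under the equal-sales-per-store assumption.
old_stores = 180
new_stores = 210
factor = 0.93  # this year's total sales are 93% of last year's

last_year_total = old_stores * 1.0            # 180 units (1 unit per store)
this_year_total = factor * last_year_total    # 167.4 units across 210 stores
per_store = this_year_total / new_stores      # what each store actually sold

# Hypothetical total had they kept only the original 180 stores
counterfactual_total = per_store * old_stores
drop = 1 - counterfactual_total / last_year_total

print(round(counterfactual_total, 2))  # ~143.49 units
print(round(drop * 100, 2))            # ~20.29 % drop
```

So without the new stores, sales would have fallen by roughly $20\,\%$ rather than $7\,\%$.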