If there are two real values called week1val and week2val, what operation is required to calculate the percentage of difference between the two values? Does the higher of the two values need to be used for calculation, the lower of the two, or what?
For example, with these particular values:
week1val = 2.77
week2val = 2.84
diff = week2val - week1val // 0.07
...how is the percentage of difference between the two values computed? Is it like so:
pct = diff div [the larger of week1val and week2val] // In the contrived case: "0.07 div 2.84"
...or:
pct = diff div [the smaller of week1val and week2val] // In the contrived case: "0.07 div 2.77"
...or some other way?
You can express this in different ways, but the easiest and most traditional is to state by what percentage of the first value the second value differs from the first.
So: ${week2val - week1val \over week1val}$, then multiply by $100$ to express it as a percentage.
Thus ${2.84 - 2.77 \over 2.77} \times 100 \approx 2.527\%$. You would say, "the second week value is about $2.527\%$ greater than the first week value".
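The formula above can be sketched as a small function; the helper name `pct_change` is my own for illustration:

```python
def pct_change(week1val: float, week2val: float) -> float:
    """Return by what percentage week2val differs from week1val,
    i.e. (week2val - week1val) / week1val * 100."""
    return (week2val - week1val) / week1val * 100

# The values from the question:
print(round(pct_change(2.77, 2.84), 3))  # about 2.527
```

A positive result means the second value is larger; a negative result means it is smaller.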