I have a black-box algorithm that calculates the volume of liquid that passed through a sensor. I also have a way to "re-calibrate" the algorithm to produce a new outcome.
e.g. I poured some fluid, weighed it, and it's 355 grams.
The algorithm claims that's 349 ml.
I re-calibrate it, pour some fluid again, and weigh it at 355 grams. The algorithm claims that's 354 ml.
I know that 349 ml / 354 ml = 0.98587570621.
When I multiply 354 ml * 0.98587570621 I get 348.999999998 ml.
Therefore I can claim that the old calibration was under-counting by 1.412429379% compared to the new one.
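To make the comparison concrete, here is a small sketch of that arithmetic (the numbers are the ones from my example above; the variable names are just my own labels):

```python
# Two readings of the SAME 355 g pour, before and after re-calibration.
old_ml = 349.0  # reading under the old calibration
new_ml = 354.0  # reading under the new calibration

ratio = old_ml / new_ml          # how the old reading compares to the new one
percent_diff = (1 - ratio) * 100 # how far the old reading falls below the new

print(f"ratio = {ratio:.11f}")                      # -> 0.98587570621
print(f"old under-counts by {percent_diff:.9f}%")   # -> 1.412429379%
```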
How can I do this when the two pours don't have the same weight?
Let's say for calibration 1 I poured 378 g, and the algorithm reported 372 ml.
And for calibration 2 I poured 358 g, and it reported 352 ml.
Is there any way to calculate the percentage by which these two calibrations differ from each other, based on the weights?
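In case it helps frame the question: one candidate approach I've considered is to normalize each reading to ml per gram and compare the rates. This only works under the assumption (not guaranteed in my setup) that both pours are the same liquid at the same density, so weight is proportional to true volume:

```python
# Sketch: compare two calibrations with different pour weights by
# normalizing each reading to "reported ml per gram poured".
# ASSUMPTION: same liquid, same density for both pours.
cal1_g, cal1_ml = 378.0, 372.0  # calibration 1: weight poured, volume reported
cal2_g, cal2_ml = 358.0, 352.0  # calibration 2: weight poured, volume reported

rate1 = cal1_ml / cal1_g  # ml reported per gram under calibration 1
rate2 = cal2_ml / cal2_g  # ml reported per gram under calibration 2

relative_diff = (rate1 / rate2 - 1) * 100  # cal 1 relative to cal 2, in %
print(f"calibration 1 reads {relative_diff:+.4f}% relative to calibration 2")
```

With these numbers, calibration 1 comes out roughly 0.09% higher per gram than calibration 2, but I'm not sure the constant-density assumption holds, which is part of what I'm asking.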