Calibration Resolution

I have a flow meter that I calibrate by doing timed catches. The best resolution I have on my scale is down to 0.01 lb. The meter I'm comparing against goes to 0.001 lb. Usually, I'll have something that flows about 20 lb/hr and I'll do a 5-minute catch on it.

This got me thinking: is there a way to mathematically determine the minimum catch time needed for a good calibration, based on the resolutions and the flow rate? If someone could show me the math, that would be extremely helpful, but even pointing me to equations or theory I could search on my own would help.
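One back-of-envelope way to frame this (my own sketch, not an established calibration procedure): the scale's resolution sets a fixed weight error of roughly one count, so the relative error of a catch is about resolution / (flow rate × catch time). Setting a target relative error and solving for time gives a minimum catch duration. The function name and the 0.1% target below are hypothetical, just to illustrate with the numbers from the question:

```python
def min_catch_time_hr(resolution_lb: float,
                      flow_lb_per_hr: float,
                      target_rel_error: float) -> float:
    """Minimum catch duration (hours) so that one scale count
    (resolution_lb) is no more than target_rel_error of the
    total weight collected during the catch."""
    return resolution_lb / (target_rel_error * flow_lb_per_hr)

if __name__ == "__main__":
    # 0.01 lb scale, 20 lb/hr flow, hypothetical 0.1% error target
    t_hr = min_catch_time_hr(0.01, 20.0, 0.001)
    print(f"minimum catch: {t_hr * 60:.1f} minutes")  # 30.0 minutes
    # For comparison, the current 5-minute catch collects about
    # 20/12 ≈ 1.67 lb, so 0.01 lb is roughly 0.6% of the catch.
```

For search terms: metrology texts usually treat a digital readout's resolution as a rectangular (uniform) distribution with standard uncertainty resolution/√12, and combine it with other error sources. Looking up "Type B uncertainty evaluation" and the GUM (Guide to the Expression of Uncertainty in Measurement) should lead to the formal treatment.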