Deriving variance

In kinetic methods of analysis, the rate of appearance of products is often used to infer the initial concentration of a reactant or substrate. The problem with this method is that the rate constant, k, often fluctuates from one run to the next due to variations in temperature, pH, etc. This problem can be minimized by a judicious choice of the time along the course of the reaction at which the rate is measured.

For a simple first-order reaction, B ---> products, with rate constant k,

the reaction rate at any time, t, can be written as: -d[B]/dt = R = k[B]0 exp(-kt)

Derive the error (variance) in the measured rate, R, assuming that the only contribution to the error is from random fluctuations in the rate constant (perhaps due to temperature fluctuations), whose magnitude is described by a variance σk^2. Use this result to show that there is a unique time, t0, along the course of the reaction when the relative standard deviation of the measured rate, σR/R, is insensitive to small errors in the rate constant.
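As a sanity check on what the problem seems to be asking, I tried propagating the error numerically with the standard first-order propagation formula, σR = |dR/dk| σk (the values of k, [B]0, and σk here are made up just for illustration):

```python
import numpy as np

# Made-up values: k = 0.5 s^-1, [B]0 = 1.0 M, sigma_k = 0.01 s^-1
k, B0, sigma_k = 0.5, 1.0, 0.01

t = np.linspace(0.01, 10.0, 1000)

# Rate: R = k*[B]0*exp(-k*t)
R = k * B0 * np.exp(-k * t)

# Propagation of error: sigma_R = |dR/dk| * sigma_k,
# where dR/dk = [B]0 * exp(-k*t) * (1 - k*t)
dRdk = B0 * np.exp(-k * t) * (1 - k * t)
sigma_R = np.abs(dRdk) * sigma_k

# Relative standard deviation: sigma_R/R = (|1 - k*t| / k) * sigma_k
rel = sigma_R / R

# Numerically, the minimum lands at t = 1/k
t_min = t[np.argmin(rel)]
print(t_min, 1.0 / k)
```

If I've set this up right, the relative standard deviation vanishes at t0 = 1/k, which would be the unique time the problem is after, but I don't really follow how to get there on paper.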

I really just need help understanding where it all comes from; I'm not getting it yet. Thank you for reading, and hopefully for responding.