Departure from uniformity in a continuous (time) distribution


I know how to quantify the departure from uniformity (i.e., from a uniform distribution) for a discrete distribution. Assume you have a distribution P:

P = {P1, P2, P3, ..., PN}

The departure from uniformity for this discrete distribution is defined as follows:

D= SUM(Pi * log (Pi / (1/N)))
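This quantity is the Kullback-Leibler divergence of P from the uniform distribution on N outcomes. A minimal sketch of the computation (the function name `kl_from_uniform` is mine, not from the question):

```python
import math

def kl_from_uniform(p):
    """KL divergence D(P || U) of a discrete distribution P from the
    uniform distribution on len(p) outcomes: sum of Pi * log(Pi * N).

    Terms with Pi == 0 are skipped, using the convention 0 * log 0 = 0.
    """
    n = len(p)
    return sum(pi * math.log(pi * n) for pi in p if pi > 0)

# The uniform distribution itself has zero divergence:
print(kl_from_uniform([0.25, 0.25, 0.25, 0.25]))  # -> 0.0
```

Note that D = 0 exactly when P is uniform, and D grows as the mass concentrates; a point mass on one of N outcomes gives D = log N, the maximum.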

Now my question is how to quantify the departure from uniformity for a time distribution, which is actually a continuous distribution. I have a family of time distribution functions of the form (B is a parameter):

P(t; B) = -t^B

Clearly, the linear distribution results when B = 1. How can I quantify the departure from uniformity for this family of time distributions? Thanks in advance; I appreciate your help. Best, HRJ
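One natural continuous analogue replaces the sum with an integral: D = ∫ p(t) log(p(t)/u(t)) dt, where u is the uniform density on the support. A sketch under the assumption (mine, not stated in the question) that the family is supported on [0, 1] and normalized as p(t; B) = (B+1) t^B, so that B = 0 recovers the uniform density; the closed form then follows from ∫₀¹ t^B log t dt = -1/(B+1)²:

```python
import math

def kl_power_density_from_uniform(b):
    """KL divergence of p(t) = (b+1) * t**b on [0, 1] from the uniform
    density u(t) = 1, via the closed form log(b+1) - b/(b+1).

    b = 0 gives the uniform density itself, hence divergence 0.
    """
    return math.log(b + 1) - b / (b + 1)

def kl_numeric(b, n=100000):
    """Midpoint-rule check of the integral of p(t) * log(p(t)) on [0, 1]."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        p = (b + 1) * t**b
        total += p * math.log(p) * h
    return total

print(kl_power_density_from_uniform(0.0))  # -> 0.0
```

The divergence is 0 at B = 0 and increases with B, so it serves as a one-number summary of how far each member of the family is from uniform; if the intended density or support differs from the normalization assumed here, the same integral applies with the appropriate p and u.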