Can someone clarify whether it is safe to declare that a distribution is not exponential if its mean and standard deviation are unequal, i.e. the coefficient of variation $c \ne 1$ (for example $c < 1$), and that it is exponential if $c = 1$?
The question is motivated by the observation that, if this were true, there would be little need for the many formal goodness-of-fit tests for exponentiality.
Many thanks.
Counterexample: $X \sim N(1,1)$ has mean $1$ and standard deviation $1$, so its coefficient of variation is $c = 1$, yet $X$ is normal, not exponential (it even takes negative values, which an exponential variable never does).

So the answer is no. Equal mean and standard deviation ($c = 1$) is a necessary property of the exponential distribution, but not a sufficient one: it does not follow that a distribution with $c = 1$ is exponential.
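A quick simulation illustrates the counterexample (a minimal sketch using NumPy; the sample size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from N(1, 1): population mean 1 and standard deviation 1,
# so the coefficient of variation is c = sd / mean = 1.
x = rng.normal(loc=1.0, scale=1.0, size=100_000)

c = x.std(ddof=1) / x.mean()
print(c)  # close to 1

# Yet this sample cannot come from an exponential distribution:
# an exponential variable is supported on [0, inf), while N(1, 1)
# produces negative values (with probability about 16%).
print((x < 0).mean())  # a clearly positive fraction of the sample
```

The sample coefficient of variation is essentially 1, but the presence of negative observations immediately rules out any exponential distribution.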