When we take a sample, do the relative frequencies of the sample follow the sampling distribution or the population probability distribution?
Also, why do events have to follow a certain distribution? What natural laws force them to follow those underlying distributions?
In principle, a sample of data (of size, say, $M$) follows a corresponding sampling distribution. For example, for a set of normally distributed data points whose population variance is a priori unknown, that variance must be estimated from the data itself, and this estimation gives rise to a theoretical t-distribution with $N$ degrees of freedom for comparison.
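A small simulation can make this concrete. The sketch below (assuming NumPy and SciPy are available; the sample size, population parameters, and threshold are illustrative choices, not taken from the question) repeatedly standardizes the sample mean using the *estimated* standard deviation and checks that the resulting statistic's tail frequency matches the t-distribution with $M - 1$ degrees of freedom rather than the normal distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
M = 10                 # sample size (illustrative)
trials = 20000         # number of repeated samples
mu, sigma = 5.0, 2.0   # hypothetical population parameters

# Standardize each sample mean with the estimated sd (ddof=1).
# Because sigma is estimated, the statistic follows a t-distribution
# with M - 1 degrees of freedom, not a standard normal.
samples = rng.normal(mu, sigma, size=(trials, M))
t_stats = (samples.mean(axis=1) - mu) / (samples.std(axis=1, ddof=1) / np.sqrt(M))

# Empirical two-sided tail frequency beyond t_{0.975, df=9} ≈ 2.262,
# compared with the theoretical t and normal tail probabilities.
emp_tail = np.mean(np.abs(t_stats) > 2.262)
t_tail = 2 * stats.t.sf(2.262, df=M - 1)   # ≈ 0.05
z_tail = 2 * stats.norm.sf(2.262)          # noticeably smaller
print(emp_tail, t_tail, z_tail)
```

The empirical tail frequency lands near the t-distribution's 5%, while the normal approximation underestimates the tail because it ignores the extra uncertainty from estimating the variance.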
A major difficulty is being sure what the precise number of degrees of freedom $N$ ($\leq M$) is, since it is a parameter of the sampling distribution. Not all sample points are guaranteed to be independent (and identically distributed). If $N$ is large (and some other weak conditions are satisfied), the central limit theorem can provide an asymptotic distribution that approximates the sampling distribution. If other parameter(s) of the distribution are large or small, these asymptotic distributions may in turn lead to further asymptotic or limiting distributions.
Usually, events do not "have to" follow certain distributions, particularly when those distributions come from models built for them, because models are themselves usually approximations of reality. However, if the events do follow the derived or assumed distribution "closely", this increases the confidence in, and the perceived accuracy of, that model, until a better (and possibly more complex) model replaces it.