When working with statistical regression models (e.g. linear regression), do we know whether the estimated "beta coefficients" are "statistically consistent"? Can this be proven mathematically?
For more "standard estimators" we can prove consistency: for example, the sample mean (which is the MLE of the mean of a Normal Distribution) is a consistent estimator of the population mean. This means that, in theory, as the number of observations in the sample increases, the sample mean converges (in probability) to the population mean. (In fact, by the Law of Large Numbers this holds for any population with a finite mean, not only the Normal Distribution.)
- Can we say the same thing about Beta Regression Coefficients?
Suppose we have some sample data, fit a linear regression model to it, and estimate the model coefficients using Maximum Likelihood Estimation. Do we know whether these Beta Regression Coefficients are "statistically consistent"? In other words: if we had access to an "infinite size sample" (i.e. the population), would the Beta Regression Coefficients estimated from the sample data get closer and closer to the values of the same coefficients calculated from the population data as the sample size increases?
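As a side note, the kind of convergence I am asking about is easy to illustrate by simulation. Below is a minimal sketch (the "true" coefficients `beta_true`, the noise scale, and the sample sizes are all made-up choices for illustration): OLS is fitted to increasingly large samples drawn from a known linear model, and the estimates should drift toward the true coefficients. Under Gaussian errors, the OLS estimate coincides with the MLE.

```python
import numpy as np

rng = np.random.default_rng(0)
beta_true = np.array([2.0, -1.5])  # hypothetical "population" coefficients

def fit_ols(n):
    # Simulate y = X @ beta_true + Gaussian noise, then estimate beta
    # by ordinary least squares (equal to the MLE under Gaussian errors).
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ beta_true + rng.normal(scale=1.0, size=n)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta_hat

# The estimation error should shrink as the sample size grows.
for n in [100, 10_000, 1_000_000]:
    err = np.abs(fit_ols(n) - beta_true).max()
    print(f"n = {n:>9}: max abs error = {err:.4f}")
```

This is only an empirical illustration for one simulated model, of course, not a proof of consistency in general.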
Thank you!