I have been learning about robust estimation methods for dealing with outliers and heavy-tailed data. I noticed that Tyler's M-estimator, whose key idea is to standardize each sample by its distance from the center, is defined by a fixed-point equation of the form
$\hat{\Sigma}=\sum_{i=1}^{N} w_i\, x_i x_i^T, \qquad w_i=\frac{K}{N\, x_i^T \hat{\Sigma}^{-1} x_i}$
Since $w_i$ weights each sample by dividing by its squared Mahalanobis distance, it seems to me that this method only takes the directional (angular) information of the data into account when estimating the covariance, while the distance (scale) information is discarded.
For example, if every sample point is pulled twice as far from the center, the estimate apparently wouldn't change. But intuitively the data now vary more, so the covariance should increase.
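To check this numerically, here is a small sketch of the usual fixed-point iteration for Tyler's estimator; the function name, the trace normalization $\mathrm{tr}(\hat{\Sigma})=p$ (i.e. taking $K=p$ in the notation above), and the iteration/tolerance settings are my own choices, not a reference implementation:

```python
import numpy as np

def tyler_m_estimator(X, n_iter=200, tol=1e-10):
    """Fixed-point iteration for Tyler's M-estimator.

    X: (N, p) array of samples, assumed already centered.
    Returns a shape matrix normalized so that trace(Sigma) = p.
    """
    N, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        inv = np.linalg.inv(Sigma)
        # d_i = x_i^T Sigma^{-1} x_i  (squared Mahalanobis distances)
        d = np.einsum('ij,jk,ik->i', X, inv, X)
        # Sigma_new = (p/N) * sum_i x_i x_i^T / d_i
        Sigma_new = (p / N) * (X.T * (1.0 / d)) @ X
        # Tyler's estimator only identifies the shape, so fix the scale.
        Sigma_new *= p / np.trace(Sigma_new)
        if np.max(np.abs(Sigma_new - Sigma)) < tol:
            Sigma = Sigma_new
            break
        Sigma = Sigma_new
    return Sigma

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3)) @ np.diag([1.0, 2.0, 3.0])
S1 = tyler_m_estimator(X)
S2 = tyler_m_estimator(2.0 * X)  # pull every point twice as far out
print(np.allclose(S1, S2))
```

This prints `True`: scaling every $x_i$ by the same constant leaves each term $x_i x_i^T / (x_i^T \hat{\Sigma}^{-1} x_i)$ unchanged, which is exactly the scale invariance I am asking about.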
I'm not sure where my reasoning goes wrong. Thanks for your comments and replies.