Why and how does the log transform reduce skewness in a dataset to make it resemble normal distribution?


I am studying a statistics book. At one point, while working with a dataset, the book applies a log transform to one of the columns because the data is skewed and we want it to be as close to a bell curve (normal distribution) as possible. I understand the reason for using some transform to do this; it is basically rescaling. What I don't really understand is why the log transform, specifically, accomplishes this. How exactly does it do so, from a mathematical standpoint?
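To make the effect concrete, here is a minimal sketch in Python (assuming NumPy; the book's actual column names and libraries are unknown to me). It draws a right-skewed log-normal sample, so the log of the data is exactly normal by construction, and compares the sample skewness before and after the transform:

```python
import numpy as np

rng = np.random.default_rng(0)

# Right-skewed data: a log-normal sample (the exponential of a normal sample).
x = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

def skewness(a):
    """Sample skewness: the third standardized moment."""
    a = np.asarray(a, dtype=float)
    m = a.mean()
    s = a.std()
    return ((a - m) ** 3).mean() / s ** 3

print(f"skewness before log: {skewness(x):.2f}")          # strongly positive
print(f"skewness after log:  {skewness(np.log(x)):.2f}")  # near zero
```

The intuition the numbers illustrate: the log compresses large values much more than small ones (log 1000 and log 100 differ by the same amount as log 100 and log 10), so a long right tail is pulled in toward the bulk of the data, reducing positive skew.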

Is it because the normal distribution's density function has the form of e raised to some power, so taking the log is like undoing that exponential? So if we want to get to the normal distribution, we apply the log? Or am I totally off?

This isn't part of homework or anything. The book is mostly coding, so it offers nothing further on the theory. This is just me being curious.