Normalization rescales the range of values to lie between $0$ and $1$:
$x' = \frac{x-\min(x)}{\max(x)-\min(x)}$
where $x$ is the actual value and $x'$ is the normalized value, which lies in the closed interval $[0,1]$.
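A quick numeric sketch of this min-max normalization (the data array is made up for illustration):

```python
import numpy as np

# Hypothetical sample data, for illustration only.
x = np.array([3.0, 7.0, 7.5, 10.0, 15.0])

# Min-max normalization: x' = (x - min(x)) / (max(x) - min(x)).
x_norm = (x - x.min()) / (x.max() - x.min())

print(x_norm.min())  # 0.0, attained at min(x)
print(x_norm.max())  # 1.0, attained at max(x)
```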
$z$-score standardization transforms the features to have zero mean and unit variance:
$x' = \frac{x-\mu}{\sigma}$
where $x'$ is the $z$-score, $\mu$ is the mean, and $\sigma$ is the standard deviation. The unit of this $z$-score is $\sigma$.
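And the same sketch for $z$-score standardization, again on made-up data, using the population standard deviation:

```python
import numpy as np

x = np.array([3.0, 7.0, 7.5, 10.0, 15.0])  # hypothetical data

mu = x.mean()
sigma = x.std()  # population standard deviation (ddof=0)

z = (x - mu) / sigma

# The result has zero mean and unit variance (up to float rounding).
print(z.mean())  # ~0
print(z.var())   # ~1
```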
Now my question: what is the unit of the following formula?
$x^{'}=\frac{x-\mu}{\max(x)-\min(x)}$
where $x^{'}$ is the standardized value, $\mu$ is the mean and $\max(x)$, $\min(x)$ are the maximum and minimum values respectively.
I know it will lie in an open interval of width $1$ inside $(-1,1)$, but it's not clear to me what the unit is, or what the unit means.
Any light shed on this would be much appreciated.
You're probably familiar with the interpretation of the $z$ score as "the number of standard deviations the test statistic is from the mean". This interpretation can be obtained from the formula for the $z$ score.
Consider
$$z = \frac{x-\mu}{\sigma}.$$
If we multiply both sides by $\sigma$ we get
$$z \sigma = x-\mu,$$
so $x-\mu$ is equal to $z$ standard deviations.
There is a similar interpretation for your variable $x'$.
The definition of $x'$ is,
$$ x' = \frac{x-\mu}{\text{range}},$$
where the range is $\max(x)-\min(x)$. Multiplying both sides by the range once again gives
$$ x' \cdot \text{range} = x-\mu.$$
Here we see that $x-\mu$ is equal to $x'$ times the range. In other words, $x'$ is the fraction of the range that makes up the distance between $x$ and $\mu$.
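This interpretation can be checked numerically; a minimal sketch on made-up data, verifying that multiplying $x'$ by the range recovers $x-\mu$ exactly:

```python
import numpy as np

x = np.array([3.0, 7.0, 7.5, 10.0, 15.0])  # hypothetical data

mu = x.mean()
rng = x.max() - x.min()

x_prime = (x - mu) / rng

# x' * range recovers x - mu: each x' is the fraction of the range
# that separates x from the mean.
print(np.allclose(x_prime * rng, x - mu))  # True
```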