Allow me to illustrate my question with the example of a coin. We say a coin is fair when its outcomes follow a discrete uniform distribution. But we can treat fairness/unfairness as a gradual quality, with "unfairer" meaning that the coin's discrete distribution of outcomes is further from the uniform distribution than that of another coin. For example, we could say that a coin with a $100\%$ chance of heads is unfairer than a coin with a $51\%$ chance of heads. A coin with a $49\%$ chance of heads and a coin with a $51\%$ chance would be equally unfair, because they are equally far from the uniform distribution. In the case of coins it's easy to judge the distance between the discrete uniform distribution and a given coin's distribution of outcomes: we just calculate $|50 - x|\%$, where $x$ is the coin's percentage chance of heads.
Unfortunately, I have no idea how to calculate the distance from the discrete uniform distribution in the case of dice. In other words, I don't know how to decide which of two dice is unfairer (assuming that all the dice are loaded). Any ideas?
There are infinitely many ways of measuring a die's deviation from fairness.
With regard to distributions, there are a number of common choices, such as the Kullback-Leibler divergence, or a Kolmogorov-Smirnov type distance (the biggest vertical difference between the cdfs).
You could indeed also use the suggestion in the comments of the largest deviation (over all pairs) between individual probabilities, or the largest deviation from the fair proportion ($\frac16$ for a six-sided die).
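To make these concrete, here is a minimal sketch (the function name and the example probabilities are my own) computing three of the measures just mentioned for a die's probability vector against the uniform distribution:

```python
import numpy as np

def unfairness_measures(p):
    """Compare a die's outcome probabilities p to the uniform distribution.

    Returns (KL divergence, KS-type distance, largest deviation from 1/k).
    """
    p = np.asarray(p, dtype=float)
    q = np.full_like(p, 1.0 / len(p))  # fair die: 1/k for each of k faces
    # Kullback-Leibler divergence D(p || q); terms with p_i = 0 contribute 0
    nz = p > 0
    kl = np.sum(p[nz] * np.log(p[nz] / q[nz]))
    # Kolmogorov-Smirnov type distance: biggest vertical gap between the cdfs
    ks = np.max(np.abs(np.cumsum(p) - np.cumsum(q)))
    # Largest deviation of any single face probability from the fair 1/k
    max_dev = np.max(np.abs(p - q))
    return kl, ks, max_dev

# A hypothetical loaded die that favours six:
loaded = [0.10, 0.10, 0.10, 0.10, 0.10, 0.50]
kl, ks, max_dev = unfairness_measures(loaded)
```

Note that the three measures generally rank dice differently, which is exactly why the choice of measure matters.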
When looking at a sample of rolls of the die, a couple of common choices of deviation from fairness are the chi-squared distance and the Kolmogorov-Smirnov distance on the empirical cdf.
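For the sample-based case, a sketch of both statistics from observed face counts (the function names and the example counts are illustrative):

```python
import numpy as np

def chi_squared_stat(counts):
    """Pearson chi-squared statistic against a fair die.

    Under fairness, each of the k faces has expected count n/k.
    """
    counts = np.asarray(counts, dtype=float)
    expected = counts.sum() / len(counts)
    return np.sum((counts - expected) ** 2 / expected)

def ks_stat(counts):
    """KS distance: biggest gap between the empirical cdf and the fair-die cdf,
    evaluated at the faces."""
    counts = np.asarray(counts, dtype=float)
    ecdf = np.cumsum(counts) / counts.sum()
    fair_cdf = np.arange(1, len(counts) + 1) / len(counts)
    return np.max(np.abs(ecdf - fair_cdf))

# Face counts from 60 rolls of a suspicious die (faces 1..6):
rolls = [8, 9, 10, 9, 8, 16]
chi2 = chi_squared_stat(rolls)  # compare to a chi-squared(5) reference
ks = ks_stat(rolls)
```

The chi-squared statistic can be referred to a $\chi^2$ distribution with $k-1$ degrees of freedom if you want a formal test rather than just a distance.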
There are many other choices that have been used. Which is best depends on what you want to achieve and which properties you care about.