I understand how normalization works: you sum the individual values of the vector, divide each value by that sum, and voilà, they sum to 1.
Why doesn't it work when you subtract them from 1? If all values sum to 1 after normalization, shouldn't the ratios work out so that (1 - X_1).sum() == 1?
What am I doing wrong? Basically, I want the smaller values to have a higher weight in the end, where a value of 0.0 would be 100% (i.e. 1.0).
# Python 3 with NumPy
import numpy as np

X = np.array([[ 58.50853002, 74.73077551, 54.46120887, 55.55526553],
[ 68.14133201, 22.2475803 , 88.79126866, 86.24927424],
[ 43.86150599, 75.99344646, 81.90051932, 50.66885662],
[ 74.81149378, 82.86920509, 36.75953127, 58.42956957]])
# Get 2nd row
X_1 = X[1,:]
# array([ 68.14133201, 22.2475803 , 88.79126866, 86.24927424])
# Get ratios for row
ratios = X_1/X_1.sum()
#array([ 0.25672106, 0.0838173 , 0.33451927, 0.32494236])
# Sum to 1
ratios.sum()
# 1.0
# Shouldn't this work?
(1 - ratios)
# array([ 0.74327894, 0.9161827 , 0.66548073, 0.67505764])
# But it doesn't...
(1 - ratios).sum()
# 3.0
Let me try to understand what you are doing:
First the sum: $$ S(x) = \sum_i x_i $$ Then the described normalization: $$ x' = x / S(x) $$ So $$ S(x') = \sum_i x_i' = \sum_i \frac{x_i}{S(x)} = \frac{1}{S(x)} \sum_i x_i = \frac{1}{S(x)} S(x) = 1 $$ OK, the normalized vector $x'$ has unit sum. (Beware: if $S(x)$ was zero, you are in for a surprise)
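To illustrate that caveat: NumPy does not raise an exception when $S(x) = 0$; the division silently produces `inf`/`nan` values (a minimal sketch, suppressing the runtime warning so the result is visible):

```python
import numpy as np

z = np.array([1.0, -1.0])  # components cancel, so z.sum() == 0.0
with np.errstate(divide="ignore", invalid="ignore"):
    out = z / z.sum()      # division by zero: yields +/-inf, not an error
print(out)                 # [ inf -inf]
```

So it pays to check `x.sum() != 0` before normalizing this way.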
Fine, if we have the new vector $y$ with $$ y_i = 1 - x_i' $$ then we get $$ S(y) = \sum_i y_i = \sum_i (1-x_i') = \left( \sum_i 1 \right) - \left( \sum_i x_i' \right) = n - S(x') = n - 1 $$ So unless $n=2$ (two components in the vector), this will not return a vector $y$ that sums to $1$.
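The derivation also suggests a fix for what you are actually after: since the complements sum to $n-1$, dividing them by $n-1$ restores a unit sum while giving smaller original values the larger weight. A sketch using your second row (note this is only one possible inversion scheme, and a 0.0 entry gets weight $1/(n-1)$, not exactly 1, unless $n=2$):

```python
import numpy as np

X_1 = np.array([68.14133201, 22.2475803, 88.79126866, 86.24927424])
ratios = X_1 / X_1.sum()
n = ratios.size

# The complements sum to n - 1, as derived above (3.0 for n = 4)
print((1 - ratios).sum())

# Renormalize the complements so they sum to 1 again;
# the smallest X_1 entry now carries the largest weight
weights = (1 - ratios) / (n - 1)
print(weights)
```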