This is the context:
$$1.3\cdot 10 = 13,\qquad 13-10=3,\qquad |3|/10 = 0.3$$
$$0.7\cdot 10 = 7,\qquad 7-10=-3,\qquad |{-3}|/10 = 0.3$$
$$(1.3)(0.7) \ne 1$$
If both numbers differ from $1$ by the same amount, $0.3$, why does the factor less than $1$ hold greater influence over the product when the two are multiplied (i.e. $1.3\cdot 0.7 = 0.91 < 1$)?


Note that $$1.3\cdot 0.7=1\cdot 0.7+0.3\cdot 0.7$$ is the sum of $0.7$ and $0.3$ of $0.7$. That is, when we "add back in" the $0.3$ by multiplying by $1.3$, we're adding only $0.3$ times $0.7$, not $0.3$ times the original $1$ we started with.
Alternatively, if you think of it as $$1.3\cdot 0.7=1.3\cdot 1-1.3\cdot 0.3,$$ we're subtracting $0.3$ times $1.3$, i.e. $0.39$, not $0.3$ times our original $1$. So we're subtracting more than we originally added to get $1.3$ from $1$, and so we end up smaller than $1$.
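
The same point can be made in general. Writing the two factors as $1+d$ and $1-d$ (here $d=0.3$), the difference of squares identity shows the product always falls short of $1$:

```latex
(1+d)(1-d) = 1 - d^2 < 1 \quad \text{for any } d \neq 0.
```

With $d = 0.3$ this gives $1 - 0.09 = 0.91$, matching the computation above: the "loss" is exactly $d^2$, the cross term that the symmetric gain and loss fail to cancel.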