Prove
Consider ordered real numbers $x_1 \le x_2 \le \dots \le x_n$ and $y_1 \le y_2 \le \dots \le y_n$.
Let $ \sigma : \{ 1,2,\dots,n\} \rightarrow \{ 1,2,\dots,n\}$ be a permutation on the integers $1,2,\dots,n$.
Show that $\sum\limits_{k=1}^n x_ky_{\sigma(k)} \le \sum\limits_{k=1}^n x_ky_k$.
My Intuition
Algebraically, to maximize a weighted sum, we want to put the heaviest weights $x_k$ on the heaviest values of $y_{\sigma(k)}$. The easiest way to do that is to order them in the same way. In other words,
$\max\limits_{\sigma}\left\{ \sum\limits_{k=1}^n x_ky_{\sigma(k)} \right\} = \sum\limits_{k=1}^n x_ky_k$
Geometrically, the permutation of $y_{\sigma(k)}$ which satisfies this condition would be the one which is ordered in the same direction as $x_k$. To maximize the dot product, we maximize the cosine or minimize the angle between these two vectors. Since permutations would only change the direction of the vector, not its length, we pick the permutation which points in the same increasing direction as $x_k$.
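As a sanity check on this intuition (not a proof), one can brute-force the maximum over all permutations for small sorted sequences and observe that the identity pairing attains it; a minimal Python sketch with made-up example data:

```python
from itertools import permutations

def max_permuted_dot(x, y):
    """Brute-force the maximum of sum(x[k] * y[sigma(k)]) over all permutations sigma."""
    return max(sum(xk * yk for xk, yk in zip(x, p)) for p in permutations(y))

x = [1, 2, 4, 7]   # already sorted: x_1 <= ... <= x_n
y = [-3, 0, 2, 5]  # already sorted: y_1 <= ... <= y_n

sorted_dot = sum(xk * yk for xk, yk in zip(x, y))
# The sorted (identity) pairing attains the maximum over all permutations.
assert max_permuted_dot(x, y) == sorted_dot  # both are 40 here
```

This only checks finitely many cases, of course; the proof attempts below aim to establish the inequality in general.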
I'm having trouble formalizing a proof for these observations.
Any hints on either a geometric or algebraic approach for a proof would be appreciated.
First Attempt
Using Steve Kass' comment as a guide, I will attempt a proof by contradiction.
Suppose $\sigma$ places some pair of values out of order. For example, suppose $y_1$ and $y_2$ are switched, i.e. $\sigma(1) = 2$, $\sigma(2) = 1$, and $\sigma(k) = k$ for $k > 2$.
Assume, for contradiction, that $\sum\limits_{k=1}^nx_ky_{\sigma(k)} \color{red}{>} \sum\limits_{k=1}^nx_ky_k$.
The common terms on both sides cancel, leaving
$(x_1y_2 + x_2y_1) \color{red}{>} (x_1y_1 + x_2y_2)$
$y_2(x_1-x_2) + y_1(x_2-x_1)\color{red}{>} 0$
$y_2(x_1-x_2) - y_1(x_1-x_2)\color{red}{>} 0$
$(y_2-y_1)(x_1-x_2)\color{red}{>} 0$
but this is a contradiction: since $x_1 \le x_2$ and $y_1 \le y_2$, the factor $y_2 - y_1$ is nonnegative while $x_1 - x_2$ is nonpositive, so their product cannot be strictly positive.
Therefore, our original assumption was wrong. We are left with:
$\sum\limits_{k=1}^nx_ky_{\sigma(k)} \le \sum\limits_{k=1}^nx_ky_k$.
The same argument applies to any pair of values that $\sigma$ places out of order.
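The two-term inequality at the heart of this argument can also be spot-checked numerically on randomly generated ordered pairs; a small hypothetical check (`swap_never_helps` is just a name for this sketch):

```python
import random

def swap_never_helps(trials=1000, seed=0):
    """Check (x1*y2 + x2*y1) <= (x1*y1 + x2*y2) whenever x1 <= x2 and y1 <= y2."""
    rng = random.Random(seed)
    for _ in range(trials):
        x1, x2 = sorted(rng.uniform(-10, 10) for _ in range(2))
        y1, y2 = sorted(rng.uniform(-10, 10) for _ in range(2))
        if x1 * y2 + x2 * y1 > x1 * y1 + x2 * y2:
            return False  # a counterexample would disprove the claim
    return True

assert swap_never_helps()
```

Random testing is no substitute for the algebraic argument above, but it is a quick way to catch a sign error in a derivation like this.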
Second Attempt (Proof By Induction)
We now generalize the argument to an arbitrary out-of-order pair. We want to prove
$\max\limits_{\sigma}\left\{ \sum\limits_{k=1}^n x_ky_{\sigma(k)} \right\} = \sum\limits_{k=1}^n x_ky_k$
given
$x_1 \le x_2 \le \dots \le x_n$
$y_1 \le y_2 \le \dots \le y_n$
Note that any permutation can be decomposed into a sequence of transpositions, i.e. swaps of two elements. A transposition pairs its two coordinates in one of two ways: in the same (increasing) order, or in opposite orders.
Fix indices $i<j$ and $k<l$. Then $x_iy_l + x_jy_k$ pairs the coordinates in opposite orders, while $x_iy_k + x_jy_l$ pairs them in the same order. Assume, for contradiction, that the former sum is strictly larger:
$(x_iy_l + x_jy_k) \color{red}{>} (x_iy_k + x_jy_l)$
$(x_iy_l + x_jy_k) - (x_iy_k + x_jy_l) \color{red}{>} 0$
$x_i(y_l - y_k) - x_j(y_l - y_k) \color{red}{>} 0$
$(x_i-x_j)(y_l - y_k) \color{red}{>} 0$
Again we have a contradiction: $x_i - x_j \le 0$ and $y_l - y_k \ge 0$, so their product cannot be strictly positive. So the correct inequality is
$(x_iy_l + x_jy_k) \le (x_iy_k + x_jy_l)$
The transposition with the matched order produces the greater sum.
From here we can increase (or at least not decrease) the value of the weighted sum one transposition at a time: repeatedly swap any out-of-order pair of $y$-values until the two sequences are matched in ordering. Each swap does not decrease the sum, and the process terminates (as in bubble sort), so the matched ordering attains the maximal sum; no permutation can exceed it.
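This exchange process can be sketched directly in code: bubble-sort the permuted $y$-values with adjacent swaps, asserting at every swap that the weighted sum never decreases. The function name and example data are my own illustration, not part of the proof:

```python
def sort_by_transpositions(x, y_perm):
    """Repair out-of-order adjacent pairs one swap at a time (bubble sort on y),
    checking that each swap never decreases the weighted sum against sorted x."""
    y = list(y_perm)
    dot = lambda: sum(a * b for a, b in zip(x, y))
    changed = True
    while changed:
        changed = False
        for i in range(len(y) - 1):
            if y[i] > y[i + 1]:
                before = dot()
                y[i], y[i + 1] = y[i + 1], y[i]
                # Swap gain is (x[i+1] - x[i]) * (y_old[i] - y_old[i+1]) >= 0.
                assert dot() >= before
                changed = True
    return y

x = [1, 2, 4, 7]
assert sort_by_transpositions(x, [5, -3, 2, 0]) == [-3, 0, 2, 5]
```

Each adjacent swap is exactly the two-term inequality proved above with $j = i+1$, so the assertion can never fire; the loop terminates because bubble sort always does.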