Let $\{A_i\}_{i=1}^n$ and $\{B_i\}_{i=1}^n$ be two sets of i.i.d. random variables (say, Bernoulli or sub-Gaussian). Assume that $\{A_i\}_{i=1}^n$ and $\{B_i\}_{i=1}^n$ are statistically independent as well. Consider the sum $$ S_n = \max_{\pi\in\Pi_n}\sum_{i=1}^nA_iB_{\pi(i)}, $$ where $\pi$ is a permutation of $\{1,2,\ldots,n\}$ and $\Pi_n$ is the set of all such permutations. I am interested in "good" upper bounds on the tail probability, i.e., $\Pr(S_n>t)$, for $t\in\mathbb{R}$.
One way to approach this is to use the union bound over $\Pi_n$, namely, $$ \Pr(S_n>t)\leq n!\Pr\left(\sum_{i=1}^nA_iB_i>t\right), $$ and then the probability term above can be easily upper bounded using the distributional assumptions on $\{A_i\}_{i=1}^n$ and $\{B_i\}_{i=1}^n$.
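For instance, for any fixed permutation $\pi$ the summands $A_iB_{\pi(i)}$ are independent across $i$ (each one uses distinct $A$'s and $B$'s), so combining the union bound with a Chernoff bound gives $$ \Pr(S_n>t)\leq n!\,\inf_{s\ge 0}e^{-st}\left(\mathbb{E}\,e^{sA_1B_1}\right)^n. $$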
The question is whether there is an elegant way to deal with this maximum other than just applying a union bound.
For any outcome $\omega\in \Omega$ and any permutation $\pi\in\Pi_n$, the rearrangement inequality gives $$\sum_{i=1}^n A_i(\omega)B_{\pi(i)}( \omega) \le \sum_{i=1}^n A_{(i)}(\omega)B_{(i)}( \omega),$$ where $(A_{(i)})_{i=1,...,n}$ and $(B_{(i)})_{i=1,...,n}$ are the order statistics of $(A_{i})_{i=1,...,n}$ and $(B_{i})_{i=1,...,n}$. In other words, the sum $\sum_{i=1}^n A_i(\omega)B_{\pi(i)}(\omega)$ is maximized when the two samples are matched in the same (sorted) order, and that maximum is attained by the permutation which sorts them accordingly.
So, we have
$$\color{red}{S_n=\sum_{i=1}^n A_{(i)}B_{(i)}} \tag{1}$$
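A quick numerical sanity check of $(1)$ (a minimal sketch assuming `numpy`; the brute-force maximum is only feasible for small $n$):

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)
n = 6
A = rng.random(n)
B = rng.random(n)

# Brute-force maximum of sum_i A_i * B_{pi(i)} over all n! permutations pi
brute = max(np.dot(A, B[list(p)]) for p in permutations(range(n)))

# Identity (1): sort both samples and take the inner product
sorted_form = np.dot(np.sort(A), np.sort(B))

assert np.isclose(brute, sorted_form)
```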
Case 1: For distributions with non-negative support, like the Bernoulli distribution in the question, assume that the i.i.d. random variables $A_i$ and $B_i$ follow the distribution of a random variable $X$ with non-negative support. We apply Markov's inequality to $S_n$ (for $t>0$); since the two samples are independent, $A_{(i)}$ and $B_{(i)}$ are independent with the same distribution, so $\mathbb{E}(A_{(i)}B_{(i)})=\mathbb{E}(A_{(i)})\mathbb{E}(B_{(i)})=\mathbb{E}^2(A_{(i)})$: $$\mathbb{P}(S_n>t) \le\frac{\mathbb{E}(S_n)}{t}=\frac{\sum_{i=1}^n\mathbb{E}(A_{(i)}B_{(i)})}{t}= \frac{\sum_{i=1}^n\mathbb{E}^2(A_{(i)})}{t} \tag{2}$$
For any $i \in \{1,2,...,n \}$, according to this result, the density function of $A_{(i)}$ (and likewise of $B_{(i)}$) is $$f_{(i)}(x)=\frac{n!}{(i-1)!(n-i)!}f_X(x)F^{i-1}(x) (1-F(x))^{n-i}.$$
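As a sanity check on the constant, the substitution $u=F(x)$ and the Beta integral give $$\int f_{(i)}(x)\,dx=\frac{n!}{(i-1)!(n-i)!}\int_{0}^{1} u^{i-1}(1-u)^{n-i}\,du=\frac{n!}{(i-1)!(n-i)!}\cdot\frac{(i-1)!\,(n-i)!}{n!}=1.$$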
Then, applying the Cauchy–Schwarz inequality (between the third and fourth lines below), we have:
$$\begin{align} \mathbb{E}^2(A_{(i)}) &= \left(\int_{0}^{+\infty} x f_{(i)}(x)dx\right)^2 \\ &=\left(\frac{n!}{(i-1)!(n-i)!}\right)^2\left(\int_{0}^{+\infty} x f_X(x)F^{i-1}(x) (1-F(x))^{n-i}dx\right)^2 \\ &=\left(\frac{n!}{(i-1)!(n-i)!}\right)^2\left(\int_{0}^{+\infty} (x \sqrt{f_X(x)})\sqrt{f_X(x)}F^{i-1}(x) (1-F(x))^{n-i}dx\right)^2 \\ & \le \left(\frac{n!}{(i-1)!(n-i)!}\right)^2 \left(\int_{0}^{+\infty} x^2 f_X(x)dx\right)\left(\int_{0}^{+\infty} f_X(x)F^{2i-2}(x) (1-F(x))^{2n-2i}dx\right)\\ & = \left(\frac{n!}{(i-1)!(n-i)!}\right)^2 (\mathbb{V}(X) + \mathbb{E}^2(X))\left(\int_{0}^{1} u^{2i-2} (1-u)^{2n-2i}du\right),\\ \end{align}$$ where the last line uses $\mathbb{E}(X^2)=\mathbb{V}(X)+\mathbb{E}^2(X)$ and the change of variables $u=F(x)$.
Then $$\sum_{i=1}^n\mathbb{E}^2(A_{(i)}) \le (\mathbb{V}(X) + \mathbb{E}^2(X)) \sum_{i=1}^n \left( \left(\frac{n!}{(i-1)!(n-i)!}\right)^2 \int_{0}^{1} u^{2i-2} (1-u)^{2n-2i}du \right) \tag{3}$$
From $(2),(3)$, we deduce that
$$\mathbb{P}(S_n>t) \le \frac{1}{t}(\mathbb{V}(X) + \mathbb{E}^2(X))\color{red}{\sum_{i=1}^n \left( \left(\frac{n!}{(i-1)!(n-i)!}\right)^2 \int_{0}^{1} u^{2i-2} (1-u)^{2n-2i}du \right)} \tag{4}$$
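The integrals are Beta integrals, $\int_{0}^{1} u^{2i-2}(1-u)^{2n-2i}\,du=B(2i-1,\,2n-2i+1)=\frac{(2i-2)!\,(2n-2i)!}{(2n-1)!}$, so the red term can also be written in closed form as $\sum_{i=1}^n \left(\frac{n!}{(i-1)!(n-i)!}\right)^2\frac{(2i-2)!\,(2n-2i)!}{(2n-1)!}$.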
The RHS of $(4)$ is a known function of $n$ and of the second moment of $X$. We can make it more elegant (but less tight) by remarking that
$$\frac{n!}{(i-1)!(n-i)!} = n\binom{n-1}{i-1} \le n\binom{n-1}{\lfloor (n-1)/2 \rfloor}.$$
Then, the red term of $(4)$ is at most $$ n\binom{n-1}{\lfloor (n-1)/2 \rfloor}\sum_{i=1}^n \left(\frac{n!}{(i-1)!(n-i)!} \int_{0}^{1} u^{2i-2} (1-u)^{2n-2i}du \right) = n^2\binom{n-1}{\lfloor (n-1)/2 \rfloor}\int_{0}^{1} \sum_{i=1}^n\binom{n-1}{i-1}u^{2(i-1)} (1-u)^{2(n-i)}du = n^2\binom{n-1}{\lfloor (n-1)/2 \rfloor} \int_{0}^{1} (u^2 + (1-u)^2)^{n-1} du,$$ where the last step is the binomial theorem.
So, a more elegant bound than $(4)$ is $$\color{red}{\mathbb{P}(S_n>t) \le \frac{1}{t}(\mathbb{V}(X) + \mathbb{E}^2(X))\, n^2\binom{n-1}{\lfloor (n-1)/2 \rfloor} \int_{0}^{1} (u^2 + (1-u)^2)^{n-1} du }\tag{5}$$
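Since $u^2+(1-u)^2\le 1$ on $[0,1]$, the integral in $(5)$ is at most $1$, so at a further loss of tightness one can also use $$\mathbb{P}(S_n>t) \le \frac{1}{t}(\mathbb{V}(X) + \mathbb{E}^2(X))\, n^2\binom{n-1}{\lfloor (n-1)/2 \rfloor}.$$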
Case 2: For sub-Gaussian distributions (which include the Gaussian distribution).
Applying the Chernoff bound, we have: $$\mathbb{P}(S_n >t) \le \inf_{s\ge 0}\frac{\mathbb{E}(e^{sS_n})}{e^{st}} \tag{6}$$
With $(1)$, we bound $\mathbb{E}(e^{sS_n})$. The products $(A_{(i)}B_{(i)})_{i=1,...,n}$ are not independent across $i$ (order statistics of the same sample are dependent), so we cannot factorize the expectation directly; instead we use the generalized Hölder inequality $\mathbb{E}\left(\prod_{i=1}^n Z_i\right)\le\prod_{i=1}^n\mathbb{E}^{1/n}\left(Z_i^n\right)$ for non-negative $Z_i$, followed by $ab\le\frac{a^2+b^2}{2}$: $$\begin{align} \mathbb{E}(e^{sS_n}) &= \mathbb{E}\left(\prod_{i=1}^n e^{s A_{(i)}B_{(i)}}\right)\\ &\le \prod_{i=1}^n \mathbb{E}^{1/n}\left(e^{ns A_{(i)}B_{(i)}}\right)\\ &\le \prod_{i=1}^n \mathbb{E}^{1/n}\left(e^{\frac{ns}{2} (A_{(i)}^2 +B_{(i)}^2)}\right) = \prod_{i=1}^n \mathbb{E}^{2/n}\left(e^{\frac{ns}{2} A_{(i)}^2}\right), \tag{7}\\ \end{align}$$ where the last equality uses the independence of the two samples and the fact that $A_{(i)}$ and $B_{(i)}$ have the same distribution.
We have, for $i=1,...,n$, applying the Cauchy–Schwarz inequality (between the third and fourth lines) as in Case 1: $$\begin{align} \mathbb{E}^2\left(e^{\frac{ns}{2} A_{(i)}^2}\right) &= \left(\int_{-\infty}^{+\infty} e^{\frac{nsx^2}{2}} f_{(i)}(x)dx\right)^2 \\ &=\left(\frac{n!}{(i-1)!(n-i)!}\right)^2\left(\int_{-\infty}^{+\infty} e^{\frac{nsx^2}{2}} f_X(x)F^{i-1}(x) (1-F(x))^{n-i}dx\right)^2 \\ &=\left(\frac{n!}{(i-1)!(n-i)!}\right)^2\left(\int_{-\infty}^{+\infty} (e^{\frac{nsx^2}{2}} \sqrt{f_X(x)})\sqrt{f_X(x)}F^{i-1}(x) (1-F(x))^{n-i}dx\right)^2 \\ & \le \left(\frac{n!}{(i-1)!(n-i)!}\right)^2 \left(\int_{-\infty}^{+\infty} e^{nsx^2} f_X(x)dx\right)\left(\int_{-\infty}^{+\infty} f_X(x)F^{2i-2}(x) (1-F(x))^{2n-2i}dx\right)\\ & = \mathbb{E}\left(e^{nsX^2} \right) \left(\frac{n!}{(i-1)!(n-i)!}\right)^2 \left(\int_{0}^{1} u^{2i-2} (1-u)^{2n-2i}du\right).\\ \end{align}$$
As $X$ is sub-Gaussian, there exists $\delta>0$ such that $\mathbb{E}\left(e^{\delta X^2} \right)<+\infty$; then, for all $0\le s \le \delta$, $\mathbb{E}\left(e^{s X^2} \right)<+\infty$.
Remark: If $X$ follows the Gaussian distribution with variance $\sigma^2$, any $\delta<\frac{1}{2\sigma^2}$ works.
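For instance, in the centered case $X\sim\mathcal{N}(0,\sigma^2)$, a direct computation gives $$\mathbb{E}\left(e^{sX^2}\right)=\frac{1}{\sqrt{1-2\sigma^2 s}},\qquad 0\le s<\frac{1}{2\sigma^2},$$ which can be substituted into the bounds below.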
Returning to $(7)$, and raising the previous bound to the power $1/n$ before taking the product over $i$, we have for all $0\le s \le \frac{\delta}{n}$: $$\begin{align} \mathbb{E}(e^{sS_n}) &\le \prod_{i=1}^n \mathbb{E}^{2/n}\left(e^{\frac{ns}{2} A_{(i)}^2}\right) \le \mathbb{E}\left(e^{nsX^2} \right) \underbrace{\prod_{i=1}^n \left(\left(\frac{n!}{(i-1)!(n-i)!}\right)^2 \int_{0}^{1} u^{2i-2} (1-u)^{2n-2i}du\right)^{1/n}}_{\text{Let us denote this term by } M}\\ \end{align}$$
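Since $M$ depends only on $n$, it can be tabulated numerically. A minimal sketch (assuming `numpy` and `scipy` are available; `log_M` is just an illustrative helper name):

```python
import numpy as np
from scipy.special import gammaln, betaln

def log_M(n):
    """log of M = prod_i ( c_i^2 * B(2i-1, 2n-2i+1) )^(1/n),
    where c_i = n! / ((i-1)! (n-i)!)."""
    i = np.arange(1, n + 1)
    log_c = gammaln(n + 1) - gammaln(i) - gammaln(n - i + 1)   # log c_i
    log_beta = betaln(2 * i - 1, 2 * n - 2 * i + 1)            # log of the Beta integral
    return np.sum(2 * log_c + log_beta) / n

for n in (5, 10, 20):
    print(n, log_M(n))
```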
From $(6)$, we have $$\mathbb{P}(S_n >t) \le M\cdot \inf_{0 \le s \le \delta/n} \frac{\mathbb{E}\left(e^{nsX^2} \right) }{e^{st}} \tag{8}$$
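For example, in the centered Gaussian case the remark above gives $\mathbb{E}\left(e^{nsX^2}\right)=(1-2\sigma^2 ns)^{-1/2}$, so $(8)$ reads $$\mathbb{P}(S_n >t) \le M\cdot \inf_{0 \le s < \frac{1}{2\sigma^2 n}} \frac{e^{-st}}{\sqrt{1-2\sigma^2 ns}},$$ and the infimum can be computed explicitly by calculus.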
If the exact distribution of the sub-Gaussian variable $X$ is not specified, we can be content with taking $s=\delta/n$ in $(8)$: $$\mathbb{P}(S_n >t) \le M\, \mathbb{E}\left(e^{\delta X^2} \right)\color{red}{e^{-\delta t/n}} \tag{9}$$
From $(9)$, we can deduce that the tail of $S_n$ decays at least exponentially in $t$ for fixed $n$, so the maximum over permutations is not heavy-tailed; the price of the maximization only shows up in the constant $M$ and in the slower rate $\delta/n$.