I'm thinking about a problem regarding the existence of certain structures in random graphs, and somehow it comes down (after some simplifications) to the following probabilistic problem.
Assume we have $n$ balls, represented by $n$ random variables $C_1,\dots,C_n$. We also have two people, Alice and Bob. Each ball is given to Alice with probability $\mathbb{P}(C_i=A)=p_n$, to Bob with probability $\mathbb{P}(C_i=B)=p_n$, and is thrown away with probability $\mathbb{P}(C_i=T)=1-2p_n$. We may assume that $p_n=\lambda n^{-1}$ if that helps. Moreover, we know that any collection of at most $k_n$ of the variables is mutually independent (for my practical purposes, $k_n\approx \ln(n)$).
Let $X=\sum_{i=1}^n \mathbf{1}_{\{C_i=A\}}$ be the number of Alice's balls and $Y=\sum_{i=1}^n \mathbf{1}_{\{C_i=B\}}$ be the number of Bob's balls. My question is: can we say something about the independence of $X$ and $Y$? If so, I expect it to hold only for values of $X,Y$ that are at most $k_n$, naturally. More formally, is there a way to obtain an asymptotic bound on $$\left|\mathbb{P}(X=a,Y=b)-\mathbb{P}(X=a)\,\mathbb{P}(Y=b)\right|$$ for $0\leq a,b\leq k_n$, or anything like that?
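To get a feel for the size of this gap, here is a small sketch that computes it exactly in the fully independent case (a stronger assumption than the $k_n$-wise independence above), where $(X,Y)$ is multinomial$(n;p,p,1-2p)$; the values of $n$ and $\lambda$ are illustrative.

```python
import math
from itertools import product

# Exact pmfs under FULL independence (stronger than the k_n-wise
# independence in the question), just to gauge the size of the gap.
# (X, Y) is then multinomial(n; p, p, 1 - 2p).

def joint_pmf(n, p, a, b):
    """P(X=a, Y=b) for multinomial counts."""
    if a + b > n:
        return 0.0
    return (math.comb(n, a) * math.comb(n - a, b)
            * p**a * p**b * (1 - 2*p)**(n - a - b))

def marginal_pmf(n, p, a):
    """P(X=a); marginally X ~ Binomial(n, p)."""
    return math.comb(n, a) * p**a * (1 - p)**(n - a)

n, lam = 1000, 2.0          # illustrative parameters
p = lam / n
k = max(1, round(math.log(n)))  # k_n ~ ln n

gap = max(abs(joint_pmf(n, p, a, b) - marginal_pmf(n, p, a) * marginal_pmf(n, p, b))
          for a, b in product(range(k + 1), repeat=2))
print(f"max gap over 0 <= a,b <= {k}: {gap:.3e}")
```

In this fully independent baseline the gap is already nonzero (the counts are negatively correlated), which suggests any bound under $k_n$-wise independence can at best match the multinomial behaviour.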
Since the notion of independence we have works well with expectations, I tried playing around with moments, hoping to generalize well-known moment methods, but having only $\ln(n)$-wise independence made that seem infeasible. Your thoughts on this are welcome.
Not an answer. I'd try to encode the outcomes as $x_i$ with Alice $\to 1$, Bob $\to -1$, $T \to 0$, so that $P(x_i = 1)=p=P(x_i = -1)$ and $P(x_i=0)=1-2p$.
Then the $x_i$ are $k$-wise independent, with $E[x_i]=0$ and $E[x_i^{2t}]=2p$ for every $t\ge 1$. Also
$$A = \frac12 (\sum x_i^2+\sum x_i)$$ $$B = \frac12 (\sum x_i^2-\sum x_i)$$
Every monomial in the expansion of $A^r B^s$ involves at most $r+s$ distinct indices, so $k$-wise independence shows that every mixed moment $E[A^r B^s]$ with $r+s\le k$ coincides with its value under full independence. In particular $E[AB]=n(n-1)p^2$ while $E[A]E[B]=n^2p^2$, so $\mathrm{Cov}(A,B)=-np^2$, which is $O(1/n)$ when $p=\lambda/n$: $A$ and $B$ are asymptotically, though not exactly, uncorrelated.
But there is a long way from here to your desired result...
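The encoding above is easy to sanity-check by exact enumeration. A minimal sketch under full independence (the question only grants $k$-wise independence, but moments of degree $\le k$ agree with the fully independent model): it confirms that $A$ and $B$ really count the $+1$'s and $-1$'s, and computes $\mathrm{Cov}(A,B)$ exactly, which comes out as $-np^2$ and so vanishes as $n\to\infty$ when $p=\lambda/n$. The values of $n$ and $p$ are illustrative.

```python
from itertools import product
from fractions import Fraction

# Exact check, by exhaustive enumeration under full independence, of the
# encoding A = (S2 + S1)/2, B = (S2 - S1)/2 and of Cov(A, B).
n = 4                   # illustrative; enumeration is 3^n outcomes
p = Fraction(1, 10)     # P(x_i = 1) = P(x_i = -1) = p, P(x_i = 0) = 1 - 2p
prob = {1: p, -1: p, 0: 1 - 2*p}

EA = EB = EAB = Fraction(0)
for xs in product((1, -1, 0), repeat=n):
    w = Fraction(1)
    for x in xs:
        w *= prob[x]
    s1, s2 = sum(xs), sum(x * x for x in xs)
    A, B = (s2 + s1) // 2, (s2 - s1) // 2
    assert A == xs.count(1) and B == xs.count(-1)  # the identity is exact
    EA += w * A; EB += w * B; EAB += w * A * B

cov = EAB - EA * EB
print(cov, -n * p**2)   # the two agree: Cov(A, B) = -n p^2
```

Using `Fraction` keeps the arithmetic exact, so the covariance identity is verified with no floating-point slack.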