Suppose $X_1,X_2,\ldots$ are independent and take values in $[0,1]$. For any coefficients $C_1,C_2, \ldots \in [0,1]$ we have the concentration inequality
$$P \left( \Big| \,\sum_{i=1}^N C_iX_i - \sum_{i=1}^N C_i \mathbb E[X_i]\,\Big|> t\right) \le 2\exp \left(-\frac{2t^2}{\sum_{i=1}^N C_i^2}\right) \le 2\exp \left(-\frac{2t^2}{N}\right).$$
In particular, taking $t = \frac{1}{2}\sum_{i=1}^N C_i \mathbb E[X_i]$, the one-sided version gives for example
$$P \left( \sum_{i=1}^N C_iX_i \le \frac{1}{2}\sum_{i=1}^N C_i \mathbb E[X_i]\right) \le \exp \left(-\frac{\big(\sum_{i=1}^N C_i \mathbb E[X_i]\big)^2}{2N}\right).$$
Provided the $C_i$ and $\mathbb E[X_i]$ do not go to zero too quickly, so that $\sum_{i=1}^N C_i \mathbb E[X_i]$ grows linearly in $N$, the RHS is of order $e^{-cN}$ for some $c >0$. Thus the sum stays close to its expectation with high probability.
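For concreteness, here is a quick Monte Carlo sanity check of the one-sided bound above, with the illustrative choices $X_i \sim \mathrm{Uniform}[0,1]$ and $C_i = 1$ (these particular choices are mine, not part of the setup):

```python
import math
import random

random.seed(0)

N = 200
trials = 10_000
mean_sum = 0.5 * N          # sum_i C_i * E[X_i], with C_i = 1 and E[X_i] = 1/2
threshold = 0.5 * mean_sum  # event: sum_i C_i X_i <= (1/2) sum_i C_i E[X_i]

hits = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(N))  # sum_i C_i X_i
    if s <= threshold:
        hits += 1

empirical = hits / trials
bound = math.exp(-mean_sum**2 / (2 * N))  # exp(-(sum_i C_i E[X_i])^2 / (2N))
print(empirical, bound)  # the empirical frequency should not exceed the bound
```

Here the bound is $e^{-25} \approx 10^{-11}$, so in practice the event essentially never occurs in the simulation.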
Now suppose that instead of the constants $C_1,C_2,\ldots$ we use random variables $Y_1,Y_2,\ldots$ as coefficients. Under suitable independence or martingale assumptions it is possible to bound the probability that $\sum_{i=1}^N Y_i X_i $ is far from its expectation. If each $X_i,Y_i$ are independent, that expectation is $\sum_{i=1}^N \mathbb E[Y_i] \mathbb E[X_i]$.
What I am interested in is subtler. I want to bound the chance $\sum_{i=1}^N Y_i X_i $ is far from $\sum_{i=1}^N \mathbb E[Y_i] X_i$. I don't know how to go about this, or what kind of assumptions are needed. Does anyone have any hints?
Some things I am happy to assume: $X_1,X_2,\ldots$ are independent, and each pair $X_i,Y_i$ is independent. I am not happy to assume that $Y_1,Y_2,\ldots$ are independent, or that $Y_i$ is independent of $X_1,\ldots, X_{i-1}$.
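To make the kind of dependence I have in mind concrete, here is a toy instance (the running-mean choice of $Y_i$ is purely illustrative, not my actual process): $Y_i$ is the running mean of the past $X$'s, so it depends on $X_1,\ldots,X_{i-1}$ but is independent of $X_i$, and $\mathbb E[Y_i] = \tfrac12$ for all $i$.

```python
import random

random.seed(0)

# Toy instance: X_i ~ Uniform[0,1] i.i.d.; Y_i = running mean of X_1,...,X_{i-1}.
# Then Y_i depends on the past but is independent of X_i, and E[Y_i] = 1/2,
# so the quantity in question compares sum_i Y_i X_i with (1/2) sum_i X_i.

N = 1000
X = [random.random() for _ in range(N)]

weighted = 0.0  # sum_i Y_i X_i
target = 0.0    # sum_i E[Y_i] X_i  (here E[Y_i] = 1/2)
past_sum = 0.0
for i in range(1, N):      # start at i = 1 so the running mean is defined
    past_sum += X[i - 1]
    Y = past_sum / i       # Y_i: mean of X_0, ..., X_{i-1}
    weighted += Y * X[i]
    target += 0.5 * X[i]

print(weighted, target)  # how far apart can these two sums be?
```

The question is what assumptions let one bound the gap between these two sums with high probability.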
Context: this comes up in the analysis of an algorithm. Each $X_i$ is a prediction for the state of the world on day $i$, and each $Y_i$ is part of the action taken on day $i$; hence $Y_i$ depends on $X_i$ directly. Since the actions change slowly, $Y_i$ is not independent of the past actions, and hence not independent of $X_{i-1},\ldots, X_1$ either. The $X_i$ in $X_i Y_i$ measures the performance of the algorithm.