I have a summation as given below $$\sum_{i}x_iy_i$$ where the $y_i$ are independent random variables and the $x_i$ may depend on each other but are independent of the $y_i$'s. I am interested in finding the conditional expectation $$E\Big[\sum_{i}x_iy_i\;\Big|\;x_iy_i\leq c\ \ \forall i\Big]$$ where $c$ is some positive constant. Now I write $$E\Big[\sum_{i}x_iy_i\;\Big|\;x_iy_i\leq c\ \ \forall i\Big]=E_{x_i}\left[\sum_i x_i E_{y_i}\left[y_i1(y_i\leq cx_i^{-1})\right]\right]$$ Is this the right thing to do? Or should there be a division by some probability (e.g. the probability that $y_i\leq cx_i^{-1}$)?
Here is the first paper that discusses this step: https://curate.nd.edu/downloads/df65v694z66. A similar step can also be found in https://www3.nd.edu/~mhaenggi/pubs/now.pdf (p. 206) and http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7412759 (p. 3961, the steps leading to equation (39)).
In the above image, $\Phi_y=\emptyset$ means that $h_x l(x-z)<y$ for all $x \in \Phi$.
For this case, if we put $P_d=1$ and $c=T$, then $x_i$ maps to $d_i^{-\alpha_d}$ and $y_i$ maps to $g_i$. And here there is no conditioning indicated on the expectation.


Let $W$ be a random variable, $\vec{Z}$ a random vector, and $\mathcal{A}$ an event with $P[\mathcal{A}]>0$. Some formulas that may help are:
1) Indicator function: $$ \boxed{E[W|\mathcal{A}] = \frac{E[W1\{\mathcal{A}\}]}{P[\mathcal{A}]}} $$ So indeed we would need to divide by $P[\mathcal{A}]$ when using this indicator function formula.
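As a quick numerical illustration of formula 1, here is a hypothetical Monte Carlo sketch (the choices $W \sim \text{Exponential}(1)$ and $\mathcal{A}=\{W\leq 1\}$ are arbitrary): the direct conditional average of $W$ over the samples in $\mathcal{A}$ coincides with the ratio $E[W1\{\mathcal{A}\}]/P[\mathcal{A}]$.

```python
import numpy as np

# Monte Carlo sanity check of formula 1: E[W|A] = E[W 1{A}] / P[A].
# Hypothetical setup: W ~ Exponential(1) and A = {W <= 1}.
rng = np.random.default_rng(0)
w = rng.exponential(scale=1.0, size=1_000_000)
in_a = w <= 1.0

lhs = w[in_a].mean()                    # direct conditional average E[W | A]
rhs = (w * in_a).mean() / in_a.mean()   # E[W 1{A}] / P[A]
print(lhs, rhs)                         # identical up to floating point
```

The two estimators agree exactly (up to floating-point rounding), since both reduce to the same sample average over $\mathcal{A}$.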
2) Iterated expectation: $$ \boxed{E[W] = E[E[W|\vec{Z}]]} $$ If PDFs exist and $Z$ is 1-dimensional then this is the same as $$ E[W] = \int_{z=-\infty}^{\infty} E[W|Z=z]f_{Z}(z)dz $$
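Formula 2 can likewise be checked numerically. In this hypothetical sketch, $Z$ is uniform on $\{0,1,2\}$ and, given $Z=z$, $W$ is exponential with mean $z+1$, so $E[W|Z]=Z+1$ and both sides should be close to $E[Z]+1=2$.

```python
import numpy as np

# Monte Carlo sanity check of formula 2: E[W] = E[E[W|Z]].
# Hypothetical model: Z uniform on {0,1,2}; given Z = z, W ~ Exponential(mean z+1),
# so E[W|Z] = Z + 1.
rng = np.random.default_rng(3)
n = 1_000_000
z = rng.integers(0, 3, size=n)
w = rng.exponential(scale=z + 1.0)   # one draw of W per draw of Z

lhs = w.mean()          # estimates E[W]
rhs = (z + 1.0).mean()  # estimates E[E[W|Z]] = E[Z + 1]
print(lhs, rhs)         # the two agree up to Monte Carlo error
```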
3) Conditioning on a world where $\mathcal{A}$ occurs: $$ \boxed{E[W|\mathcal{A}] = E[E[W|\vec{Z}, \mathcal{A}]|\mathcal{A}]} $$ So conditioning on $\mathcal{A}$ must appear everywhere, and can only be removed or modified if more info is given. If PDFs exist and $Z$ is 1-dimensional, this is the same as: $$ E[W|\mathcal{A}] = \int_{z=-\infty}^{\infty} E[W|Z=z,\mathcal{A}]f_{Z|\mathcal{A}}(z|\mathcal{A})dz$$
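Formula 3 can also be illustrated numerically. In this hypothetical sketch, $Z$ is made discrete so the inner conditional expectation $E[W|Z=z,\mathcal{A}]$ is just a per-value sample average, and the outer expectation is taken against the empirical distribution of $Z$ given $\mathcal{A}$.

```python
import numpy as np

# Monte Carlo sanity check of formula 3: E[W|A] = E[ E[W|Z,A] | A ].
# Hypothetical model: Z uniform on {0,1}, W = Z + Exponential(1), A = {W <= 1.5}.
rng = np.random.default_rng(1)
n = 1_000_000
z = rng.integers(0, 2, size=n)
w = z + rng.exponential(1.0, size=n)   # W depends on Z
in_a = w <= 1.5

lhs = w[in_a].mean()                   # E[W | A] directly

# inner: E[W | Z = k, A] per value k; outer: weight by P[Z = k | A]
rhs = sum(w[in_a & (z == k)].mean() * ((in_a & (z == k)).sum() / in_a.sum())
          for k in (0, 1))
print(lhs, rhs)                        # identical up to floating point
```

Note that the conditional density $f_{Z|\mathcal{A}}$ from the formula appears here as the empirical weights $P[Z=k\,|\,\mathcal{A}]$.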
For your problem you can use either formula 1 or formula 3, but formula 3 is more direct. As suggested by Did, for each $i \in \{1, …, n\}$ define $\mathcal{A}_i = \{X_i Y_i \leq c \}$. Define $\mathcal{A} = \cap_{i=1}^n \mathcal{A}_i$.
Via formula 3:
Fix $i \in \{1, …, n\}$. Then:
\begin{align} E[X_iY_i|\mathcal{A}] &= E[E[X_iY_i|X_i, \mathcal{A}]|\mathcal{A}]\\ &= E[X_i E[Y_i | X_i , \mathcal{A}]|\mathcal{A}] \\ &= E[X_i E[Y_i | X_i, \mathcal{A}_i]|\mathcal{A}] \end{align} where the last equality uses $$E[Y_i|X_i, \mathcal{A}]=E[Y_i|X_i, \mathcal{A}_i]$$ which follows from the independence properties of $Y_i$: conditionally on $X_i$, the variable $Y_i$ is independent of $\{X_j, Y_j\}_{j\neq i}$, hence of the remaining events $\mathcal{A}_j$, $j \neq i$, so conditioning on them does not change the conditional expectation of $Y_i$.
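The chain of equalities above can be sanity-checked by simulation. In this hypothetical setup (all distributional choices are mine, not from the question): $X_1$ is uniform on $\{1,2\}$, $X_2 = 3 - X_1$ (so the $X_i$ are dependent on each other), the $Y_i$ are i.i.d. exponential and independent of the $X_i$, and $\mathcal{A} = \mathcal{A}_1 \cap \mathcal{A}_2$ with $\mathcal{A}_i = \{X_iY_i \le c\}$.

```python
import numpy as np

# Monte Carlo check of E[X1 Y1 | A] = E[ X1 E[Y1 | X1, A_1] | A ],
# where A = A_1 ∩ A_2 and A_i = {X_i Y_i <= c}.
rng = np.random.default_rng(2)
n = 1_000_000
c = 1.0
x1 = rng.integers(1, 3, size=n).astype(float)  # X1 uniform on {1, 2}
x2 = 3.0 - x1                                  # X2 fully dependent on X1
y1 = rng.exponential(1.0, size=n)              # Y1, Y2 i.i.d., independent of X's
y2 = rng.exponential(1.0, size=n)
a1 = x1 * y1 <= c
a2 = x2 * y2 <= c
a = a1 & a2

lhs = (x1 * y1)[a].mean()                      # E[X1 Y1 | A] directly

# inner conditional expectation E[Y1 | X1 = x, A_1], estimated per x value;
# note it is conditioned on A_1 only, not on all of A
g1 = y1[a1 & (x1 == 1.0)].mean()
g2 = y1[a1 & (x1 == 2.0)].mean()
inner = np.where(x1 == 1.0, g1, g2)
rhs = (x1 * inner)[a].mean()                   # E[X1 E[Y1|X1, A_1] | A]
print(lhs, rhs)                                # agree up to Monte Carlo error
```

The point of the sketch is that the inner expectation only needs $\mathcal{A}_1$, not all of $\mathcal{A}$, exactly as the last equality claims.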
Via formula 1:
Fix $i \in \{1, …, n\}$. Then:
\begin{align} E[X_iY_i | \mathcal{A}] &= \frac{E[X_i Y_i 1\{\mathcal{A}\}]}{P[\mathcal{A}]} \\ &= \frac{E[E[X_i Y_i 1\{\mathcal{A}\} | X_i, 1\{\mathcal{A}\}]]}{P[\mathcal{A}]}\\ &= \frac{E[X_i 1\{\mathcal{A}\} E[Y_i | X_i , 1\{\mathcal{A}\}]]}{P[\mathcal{A}]}\\ &=\frac{E[X_i 1\{\mathcal{A}\} E[Y_i | X_i , \mathcal{A}_i]]}{P[\mathcal{A}]}\\ &= E[X_i E[Y_i | X_i, \mathcal{A}_i]|\mathcal{A}] \end{align} where the second-to-last equality uses $$ 1\{\mathcal{A}\} E[Y_i | X_i , 1\{\mathcal{A}\}] = 1\{\mathcal{A}\} E[Y_i | X_i , \mathcal{A}_i] $$ which again follows from the independence properties of $Y_i$.
Proof of formula 3:
Formula 3 is consistent with formula 1; indeed we get: \begin{align} E[W|\mathcal{A}] &\overset{(a)}{=} \frac{E[W 1\{\mathcal{A}\}]}{P[\mathcal{A}]} \\ &= \frac{E[E[W 1\{\mathcal{A}\}|\vec{Z}, 1\{\mathcal{A}\} ]]}{P[\mathcal{A}]} \\ &= \frac{E[1\{\mathcal{A}\}E[W|\vec{Z}, 1\{\mathcal{A}\}]]}{P[\mathcal{A}]} \\ &\overset{(b)}{=} \frac{E[1\{\mathcal{A}\}E[W|\vec{Z}, \mathcal{A}]]}{P[\mathcal{A}]} \\ &\overset{(c)}=E[E[W|\vec{Z}, \mathcal{A}]|\mathcal{A}] \end{align} where (b) uses $$ 1\{\mathcal{A}\} E[W|\vec{Z}, 1\{\mathcal{A}\}] = 1\{\mathcal{A}\} E[W|\vec{Z}, \mathcal{A}] $$ and where (a) and (c) both use formula 1.