Is the following true for discrete random variables $K,Y,X_1,X_2,\dots$, where $K$ takes values in $\{0,1,2,\dots\}$ and is independent of $\sigma(Y,X_1,X_2,\dots)$?
$$E[\sum_{i=1}^K X_i | K,Y] = \sum_{i=1}^K E[X_i|K,Y] = \sum_{i=1}^K E[X_i|Y]$$
Intuitively it seems fine, but I am lacking a rigorous argument. I only know how to drop a condition in conditional probability expressions, as in $P[X_i = x \mid K,Y] = P[X_i = x \mid Y]$.
Thinking of independence as "$K$ tells us nothing about $Y,X_1,\dots$" should then also allow us to drop the condition in the conditional expectation. Is this also true, rigorously speaking?
Draft (I am unsure about the following attempt):
Let us consider the factorization $E[\sum_{i=1}^K X_i\ |\ K,Y] = \underbrace{E[\sum_{i=1}^K X_i\ |\ (K,Y) = (\cdot,\cdot)]}_{(1)} \circ (K,Y)$, where we obtain (1) from
$$E[\sum_{i=1}^K X_i | K=k,Y=y] = \sum_{i=1}^k E[X_i|K=k,Y=y] = \sum_{i=1}^k E[X_i|Y=y]$$
Putting this together, we have $$E[\sum_{i=1}^K X_i\ |\ K,Y] = \sum_{i=1}^K E[X_i|Y].$$
It might help to rewrite the random sum as $$ S=\sum_{k=1}^\infty\mathbf 1_{K\geqslant k}\,X_k. $$ Since every $\mathbf 1_{K\geqslant k}$ is measurable with respect to $\sigma(K,Y)$, $$ E(S\mid K,Y)=\sum_{k=1}^\infty\mathbf 1_{K\geqslant k}\,E(X_k\mid K,Y)=\sum_{k=1}^KE(X_k\mid K,Y), $$ where the interchange of the infinite sum and the conditional expectation is justified, for example, when the $X_k$ are nonnegative or when $\sum_k E|X_k\mathbf 1_{K\geqslant k}|$ is finite.

This step does not use independence at all. All that is needed is that, for every $U$ measurable with respect to the sigma-algebra $\mathcal G$ and every integrable $V$ with $UV$ integrable, $$ E(UV\mid\mathcal G)=U\,E(V\mid\mathcal G). $$

To go further, note that, for all sigma-algebras $\mathcal G$ and $\mathcal K$: if the sigma-algebras $\mathcal K$ and $\sigma(\sigma(X),\mathcal G)$ are independent, then $E(X\mid\mathcal G,\mathcal K)=E(X\mid\mathcal G)$.
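For completeness, here is a sketch of why this last general result holds, assuming $X$ integrable (this is a standard fact, but since the question asks for rigor): for every $G\in\mathcal G$ and $H\in\mathcal K$,

$$E[\mathbf 1_G\mathbf 1_H X]=E[\mathbf 1_H]\,E[\mathbf 1_G X]=E[\mathbf 1_H]\,E[\mathbf 1_G\,E(X\mid\mathcal G)]=E[\mathbf 1_G\mathbf 1_H\,E(X\mid\mathcal G)],$$

where the first equality holds because $\mathbf 1_G X$ is measurable with respect to $\sigma(\sigma(X),\mathcal G)$, hence independent of $\mathbf 1_H$; the second is the defining property of $E(X\mid\mathcal G)$; and the third holds because $\mathbf 1_G\,E(X\mid\mathcal G)$ is $\mathcal G$-measurable, hence also independent of $\mathbf 1_H$. The events $G\cap H$ form a $\pi$-system generating $\sigma(\mathcal G,\mathcal K)$, so the defining property of conditional expectation yields $E(X\mid\mathcal G,\mathcal K)=E(X\mid\mathcal G)$.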
Under the hypothesis that $K$ is independent of $(Y,(X_k)_{k\geqslant1})$, one can apply this general result to $X=X_k$, $\mathcal G=\sigma(Y)$ and $\mathcal K=\sigma(K)$, getting $$E(X_k\mid K,Y)=E(X_k\mid Y)$$ for every $k$. Thus, $$ E(S\mid K,Y)=\sum_{k=1}^KE(X_k\mid Y). $$

As a final caveat, note that if the pairs $(X_k,Y)$ are identically distributed, then $E(X_k\mid Y)$ does not depend on $k$, that is, $E(X_k\mid Y)=E(X_1\mid Y)$ for every $k$, hence $$E(S\mid K,Y)=K\,E(X_1\mid Y),$$ but the hypothesis that the sequence $(X_k)_{k\geqslant1}$ is i.i.d. is not enough to guarantee this. For instance, if the $X_k$ are i.i.d. and $Y=X_1$, then $E(X_1\mid Y)=Y$ while $E(X_2\mid Y)=E(X_2)$.
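As a sanity check, the identity $E(S\mid K=k,Y=y)=\sum_{i=1}^k E(X_i\mid Y=y)$ can be verified by exact enumeration on a small discrete example. The distributions below are my own toy choices, not from the question; independence of $K$ enters as the product structure of the joint law.

```python
from itertools import product
from fractions import Fraction as F

# Toy setup: K in {0,1,2} independent of (Y, X1, X2); given Y=y,
# X1 and X2 are independent Bernoulli((1+2y)/4), so they depend on Y.
pK = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}

pYX = {}
for y, x1, x2 in product((0, 1), repeat=3):
    p = F(1 + 2 * y, 4)  # success probability of each X_i given Y=y
    pYX[(y, x1, x2)] = F(1, 2) * (p if x1 else 1 - p) * (p if x2 else 1 - p)

# Full joint law of (K, Y, X1, X2); independence of K is the product below.
joint = {(k, y, x1, x2): pK[k] * q for k in pK for (y, x1, x2), q in pYX.items()}

def E_S_given_K_Y(k, y):
    """E[sum_{i=1}^K X_i | K=k, Y=y], computed from the joint law."""
    sect = {(x1, x2): joint[(k, y, x1, x2)]
            for x1, x2 in product((0, 1), repeat=2)}
    tot = sum(sect.values())
    return sum(q * sum((x1, x2)[:k]) for (x1, x2), q in sect.items()) / tot

def E_Xi_given_Y(i, y):
    """E[X_i | Y=y]."""
    num = sum(q * (x1, x2)[i - 1] for (yy, x1, x2), q in pYX.items() if yy == y)
    den = sum(q for (yy, x1, x2), q in pYX.items() if yy == y)
    return num / den

# Check E[S | K=k, Y=y] = sum_{i<=k} E[X_i | Y=y] for all (k, y).
for k in pK:
    for y in (0, 1):
        lhs = E_S_given_K_Y(k, y)
        rhs = sum(E_Xi_given_Y(i, y) for i in range(1, k + 1))
        assert lhs == rhs, (k, y, lhs, rhs)
```

Using exact rational arithmetic via `fractions.Fraction` makes the equalities exact rather than approximate; in this example the $X_i$ are also conditionally i.i.d. given $Y$, so $E(S\mid K=k,Y=y)=k\,E(X_1\mid Y=y)$ as well.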