Let $t_1,\ldots,t_m$ be random variables drawn independently and identically from a Bernoulli distribution with a constant parameter $p$.
Now, for constants $w_1,\ldots,w_m$, define the following functions of $t_1,\ldots,t_m$:
$$f_i(t_1,\ldots,t_m)=t_i\mathbf{I}(1-\sum_{j=1}^m w_jt_j\geq 0)$$
where $\mathbf{I}(x\geq 0)=1$ if $x\geq 0$, and $\mathbf{I}(x\geq 0)=0$ if $x< 0$.
My question: what is the expected value of each $f_i$, as a function of $w_1,\ldots,w_m$ and $p$?
$$g_i(w_1,\ldots,w_m,p)=\mathbf{E}[f_i(t_1,\ldots,t_m)]=?$$
Edit:
If $f(t_1,\ldots,t_m)=[f_1(t_1,\ldots,t_m),\ldots,f_m(t_1,\ldots,t_m)]^T$, then I am interested in:
$$g(w_1,\ldots,w_m,p)=\mathbf{E}_{t_1,\ldots,t_m\,\overset{\text{iid}}{\sim}\,\mathrm{Bernoulli}(p)}[f(t_1,\ldots,t_m)]$$
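For small $m$, the vector $g$ can be computed exactly by enumerating all $2^m$ outcomes of $(t_1,\ldots,t_m)$ and weighting each by its Bernoulli probability. A minimal Python sketch (the function name `g` is mine, not from any library):

```python
from itertools import product

def g(w, p):
    """Exact E[f] = (g_1, ..., g_m) by enumerating all 2^m outcomes.

    w : list of constant weights w_1, ..., w_m
    p : Bernoulli parameter
    """
    m = len(w)
    g_vec = [0.0] * m
    for t in product([0, 1], repeat=m):
        # probability of this outcome under iid Bernoulli(p)
        k = sum(t)
        prob = p**k * (1 - p)**(m - k)
        # indicator that 1 - sum_j w_j t_j >= 0
        ind = 1 if 1 - sum(wj * tj for wj, tj in zip(w, t)) >= 0 else 0
        for i in range(m):
            g_vec[i] += t[i] * ind * prob
    return g_vec
```

For example, with all $w_j = 0$ the indicator is always $1$, so each $g_i$ reduces to $\mathbf{E}[t_i] = p$, which the enumeration reproduces. This is exponential in $m$, of course, which is consistent with the hardness remark below.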
${\bf E} [f_i(t_1,\ldots,t_m)]$ is the probability that $t_i = 1$ and $\sum_j w_j t_j \le 1$. Conditioning on $t_i = 1$, we get $${\bf E} [f_i(t_1,\ldots,t_m)] = p\; {\bf P}\left\{ \sum_{j \ne i} w_j t_j \le 1 - w_i\right\}$$ I doubt that there's a simpler expression for this. Even determining whether ${\bf P}\{\sum_j w_j t_j = 0,\ \text{not all } t_j = 0\} > 0$ (where the $w_j$ are integers), i.e. whether some nonempty subset of the $w_j$'s sums to $0$, is an NP-complete problem (the subset sum problem).
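As a sanity check, the conditioning identity can be evaluated numerically for small $m$ by brute force over the remaining $m-1$ variables. A sketch in Python (the helper name `g_i_conditioned` and the example weights are mine):

```python
from itertools import product

def g_i_conditioned(w, p, i):
    """E[f_i] via the identity p * P{ sum_{j != i} w_j t_j <= 1 - w_i }."""
    others = [wj for j, wj in enumerate(w) if j != i]
    prob_ok = 0.0
    # enumerate the 2^(m-1) outcomes of the remaining variables
    for t in product([0, 1], repeat=len(others)):
        k = sum(t)
        prob = p**k * (1 - p)**(len(others) - k)
        if sum(wj * tj for wj, tj in zip(others, t)) <= 1 - w[i]:
            prob_ok += prob
    return p * prob_ok

# e.g. with w = [0.4, 0.7], p = 0.5: only the outcomes where exactly
# one t_j is 1 satisfy the constraint, so g_1 = g_2 = 0.25
```

This agrees with direct enumeration of $\mathbf{E}[f_i]$, but it is still exponential in $m$, in line with the subset-sum hardness noted above.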