Consider a random variable $X \in \mathbb{R}^d$ with distribution $p = p_X$. Suppose we have information on the marginals $Y_i = X^T A_i$ in the form of probabilities $q_i \approx \Pr(Y_i < \alpha_i)$, $i = 1, 2, \dots, n$, with $A_i \in \mathbb{R}^d$, and we want to weight the marginals in some way to obtain an estimate of the distribution $p$.
We can write each marginal probability $p_i = \Pr(Y_i < \alpha_i) = \mathcal{A}_i \circ p = \mathbb{E}\left(\mathbb{1}\{X^TA_i < \alpha_i\}\right) = \int \mathbb{1}\{x^TA_i < \alpha_i\}\, p(x) \,\text{d}x$ as a linear functional of the probability $p$.
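As a quick numeric sanity check on the linearity claim (a toy setup: the discretization of $X$, the direction $A_i$, and the threshold $\alpha_i$ are all made up for the demo), once $X$ is restricted to a finite support the functional $\mathcal{A}_i \circ p$ is just an inner product between an indicator vector and the pmf vector $p$, hence linear in $p$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up finite support for X in R^3 and a made-up halfspace (A_i, alpha_i).
X = rng.normal(size=(50, 3))                # 50 support points
A_i, alpha_i = rng.normal(size=3), 0.3
a = (X @ A_i < alpha_i).astype(float)       # indicator vector of the functional

# A_i ∘ p = a^T p, so it respects convex combinations of pmfs.
p1 = rng.dirichlet(np.ones(50))
p2 = rng.dirichlet(np.ones(50))
lam = 0.25
lhs = a @ (lam * p1 + (1 - lam) * p2)
rhs = lam * (a @ p1) + (1 - lam) * (a @ p2)
print(np.isclose(lhs, rhs))                 # prints True
```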
We might try to solve the following minimization problem, where the $q_i$ are the observed marginal probabilities and the $w_i$ are weights: $$ \text{arg min}_{p} \sum_i w_i B_{\phi}\left(q_i, \mathcal{A}_i \circ p\right) $$ We could rewrite this in terms of a discrete (Dirichlet-type) random index $I$ in the first argument; however, this is still not a typical form, since the second argument also varies with $i$: $$ \text{arg min}_{p} \mathbb{E}_I\left(B_{\phi}\left(q_I, \mathcal{A}_I \circ p\right)\right) $$
Is there any approach to solving this, or to transforming it into a shape where some of the standard Bregman projection tricks can work? Or anything else, other than a rather brute-force implementation that solves a nonlinear optimization over a parametrization of $p$?
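For concreteness, here is a minimal sketch of the brute-force route I have in mind, under assumptions made up for the demo: $X$ is discretized on a 2D grid, $p$ is parametrized by a softmax over the grid weights, $\phi(t) = t\log t$ (so $B_\phi$ is the scalar generalized KL divergence), and the targets $q_i$ are generated from a known ground-truth $p$ so the problem is feasible by construction:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Discretize X on a grid in R^2; p becomes a probability vector over the grid.
g = np.linspace(-3.0, 3.0, 15)
X = np.array([[a, b] for a in g for b in g])        # (225, 2) support points

# Halfspace functionals: (A_i ∘ p) = sum_x 1{x^T A_i < alpha_i} p(x).
A = rng.normal(size=(4, 2))                          # n = 4 directions (made up)
alpha = np.array([0.5, -0.2, 1.0, 0.0])              # thresholds (made up)
M = (X @ A.T < alpha).astype(float)                  # (225, 4) indicator matrix

# Ground-truth p (discretized standard Gaussian) and the targets q_i it
# induces, so the demo problem has an exact solution.
p_true = np.exp(-0.5 * (X ** 2).sum(axis=1))
p_true /= p_true.sum()
q = M.T @ p_true
w = np.ones(len(q))                                  # uniform weights w_i

def bregman_kl(qi, si):
    # B_phi(q, s) with phi(t) = t log t:  q log(q/s) - q + s
    return qi * np.log(qi / si) - qi + si

def objective(theta):
    p = np.exp(theta - theta.max())
    p /= p.sum()                                     # softmax -> valid pmf
    s = np.clip(M.T @ p, 1e-12, None)                # s_i = A_i ∘ p
    return float(np.sum(w * bregman_kl(q, s)))

res = minimize(objective, np.zeros(len(X)), method="L-BFGS-B")
p_hat = np.exp(res.x - res.x.max())
p_hat /= p_hat.sum()
print("fitted marginals:", np.round(M.T @ p_hat, 3))
print("target marginals:", np.round(q, 3))
```

This works, but it optimizes over as many parameters as grid points and loses all the structure the Bregman-projection literature exploits, which is exactly why I'm hoping for a more principled reformulation.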