Let $q: \mathbb{R}^d \rightarrow \mathbb{R}^{+}$ and $s: \mathbb{R}^d \times \mathbb{R}^d \rightarrow \mathbb{R}^{+}$, $d \in \mathbb{N}$, be two functions, both assumed to be continuous and sufficiently often differentiable. Furthermore, I know that $\int_{\mathcal{X}} q(x) \, dx = 1$, where $\mathcal{X} \subset \mathbb{R}^d$ is a hyper-rectangle. I want to minimize $$ \int_{\mathcal{X}} q(r) \cdot s_{x}(r) \, dr $$ with respect to $x$. For context: $s_{x}(r)$ denotes the standard deviation of a model I have been developing, evaluated at a point $r \in \mathbb{R}^d$, if I were to add the point $x \in \mathbb{R}^d$ to my training data. So what I am trying to do is find the next optimal point to add to my training data by minimizing the expected model uncertainty (standard deviation).
Is there an analytic way to optimize this objective in terms of $q$ and $s$? Both are fairly cheap to evaluate; it is the integral that costs a lot of computing power.
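For concreteness: since $q$ is a probability density on $\mathcal{X}$, the objective is the expectation $\mathbb{E}_{r \sim q}[s_x(r)]$, so it can be estimated by Monte Carlo with samples drawn from $q$ rather than by a full quadrature. A minimal sketch, where the density (uniform on $[0,1]^2$) and the standard-deviation function `s` are hypothetical placeholders, not my actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def s(x, r):
    # Placeholder for s_x(r): uncertainty shrinks near the candidate point x
    # and approaches 1 far away from it. Replace with the real model's
    # predictive standard deviation.
    return 1.0 - np.exp(-np.sum((r - x) ** 2, axis=-1))

def objective(x, samples):
    # Monte Carlo estimate of E_{r~q}[s_x(r)] using samples drawn from q.
    return s(x, samples).mean()

# Placeholder q: uniform density on the unit square [0, 1]^2,
# so sampling from q is just uniform sampling over the rectangle.
samples = rng.uniform(0.0, 1.0, size=(10_000, 2))

# A central candidate should yield lower expected uncertainty than a
# candidate far outside the region of interest.
print(objective(np.array([0.5, 0.5]), samples))
print(objective(np.array([5.0, 5.0]), samples))
```

The same fixed sample set can be reused across candidate points $x$, which keeps the estimate smooth in $x$ and cheap enough to hand to a standard numerical optimizer.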