A similar question has been asked before at this link; however, in my opinion, the first answer does not address the point and considers an approach unrelated to the question.
To formulate my question in very simple terms, assume we have a vector $\mathbf{X} = (X_1,\dots,X_k)$ of independent random variables with densities $X_i \sim p_{X_i}(x) = N(\mu_i,\sigma_i^2)$. Let $S = \sum_{i=1}^{k}X_i$. We know that the mean of $S$ is \begin{align} E[S] = \sum_{i=1}^{k}\mu_i := \mu_s \end{align} and the variance is \begin{align} Var(S) = \sum_{i=1}^{k}\sigma_i^2 := \sigma_s^2. \end{align}
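As a quick sanity check (not part of the question itself), these two closed forms can be verified by simulation. The means and standard deviations below are arbitrary, hypothetical choices:

```python
import numpy as np

# Monte Carlo sketch: draw k independent normals with hypothetical
# parameters, sum each draw, and compare the sample mean/variance of S
# with mu_s = sum(mu_i) and sigma_s^2 = sum(sigma_i^2).
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])      # hypothetical means mu_i
sigma = np.array([0.5, 1.0, 2.0])    # hypothetical std devs sigma_i
n = 200_000

X = rng.normal(mu, sigma, size=(n, len(mu)))  # each row is one draw of (X_1,...,X_k)
S = X.sum(axis=1)

print(S.mean(), mu.sum())            # sample mean vs mu_s
print(S.var(), (sigma**2).sum())     # sample variance vs sigma_s^2
```

The sample mean and variance of $S$ agree with $\mu_s$ and $\sigma_s^2$ up to Monte Carlo error.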
This can be done by applying the expectation and variance operations to each variable independently, or, to be more exact, by computing the following integral for the expectation: \begin{align} E[S] = \int\dots\int(x_1+\dots+x_k)p_{X_1}\dots p_{X_k}\,dx_1\dots dx_k \end{align} Equivalently, this is saying that $S$ has distribution $p_{S}(s) = N(\mu_s,\sigma_s^2)$ and we have: \begin{align} E[S] = \int s\, p_S(s)\,ds \end{align} My question, which is similar to the one in the link and probably very trivial (although I can't find a good reference), is: is there an analytic way (different from the probabilistic way above) to show that \begin{align} \int s\, p_S(s)\,ds = \int\dots\int(x_1+\dots+x_k)p_{X_1}\dots p_{X_k}\,dx_1\dots dx_k\,? \end{align}
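The claimed equality of the two integrals can at least be checked numerically. Below is a small sketch for $k=2$ with hypothetical parameters: the left side is a single integral against $p_S = N(\mu_s, \sigma_s^2)$, the right side a double integral against the product density, both evaluated by the trapezoid rule on truncated grids:

```python
import numpy as np

def npdf(x, mu, sig):
    """Density of N(mu, sig^2)."""
    return np.exp(-(x - mu)**2 / (2 * sig**2)) / (sig * np.sqrt(2 * np.pi))

def trap(y, x, axis=-1):
    """Trapezoid rule along one axis (kept explicit for portability)."""
    y = np.moveaxis(y, axis, -1)
    return np.sum((y[..., 1:] + y[..., :-1]) * np.diff(x) / 2, axis=-1)

# Hypothetical parameters for X_1, X_2
mu1, sig1 = 1.0, 0.7
mu2, sig2 = -0.5, 1.2
mu_s, sig_s = mu1 + mu2, np.sqrt(sig1**2 + sig2**2)

# LHS: integral of s * p_S(s) over a truncated grid
s = np.linspace(mu_s - 8 * sig_s, mu_s + 8 * sig_s, 4001)
lhs = trap(s * npdf(s, mu_s, sig_s), s)

# RHS: double integral of (x1 + x2) * p_{X1}(x1) * p_{X2}(x2)
x1 = np.linspace(mu1 - 8 * sig1, mu1 + 8 * sig1, 801)
x2 = np.linspace(mu2 - 8 * sig2, mu2 + 8 * sig2, 801)
X1, X2 = np.meshgrid(x1, x2, indexing="ij")
integrand = (X1 + X2) * npdf(X1, mu1, sig1) * npdf(X2, mu2, sig2)
rhs = trap(trap(integrand, x2, axis=1), x1)

print(lhs, rhs)   # both close to mu1 + mu2 = 0.5
```

Of course this only confirms the identity numerically; the question is about an analytic proof.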
In some sense, we have a map $S = f(\mathbf{X}) = a^T \mathbf{X}$ where $a$ is a vector of ones, but this map is not invertible, so applying the standard change-of-variables technique does not work. The author in the link below references some work that says why it's difficult, but it's not clear to either of us. I do not know much about this area, but can this be achieved through the theory of "differential forms"?
Writing $n = k$ for the number of variables: by independence and the Law of Total Probability, $p_S(s)$ is given by
$\begin{align}p_S(s)&:=\iiint_{\Bbb R^{n-1}} \left(\prod_{j=1}^{n-1} p_{\small X_j}(x_j)\right)p_{\small X_n\!\!}\left(s-\sum_{i=1}^{n-1} x_i\right)\,\mathrm d(x_k)_{k=1}^{n-1}\\\therefore\\\mathsf E(S) &= \int_{\Bbb R} s\iiint_{\Bbb R^{n-1}} \left(\prod_{j=1}^{n-1} p_{\small X_j}(x_j)\right)p_{\small X_n\!\!}\left(s-\sum_{i=1}^{n-1}x_i\right)\,\mathrm d(x_k)_{k=1}^{n-1}\,\mathrm ds\\&=\iiint_{\Bbb R^n} \left(\sum_{i=1}^n x_i\right)\left(\prod_{j=1}^np_{\small X_j}(x_j)\right)\,\mathrm d(x_k)_{k=1}^n&&\small x_n=s-\sum_{i=1}^{n-1}x_i\\&=\sum_{i=1}^n\iiint_{\Bbb R^n}x_i\left(\prod_{j=1}^np_{\small X_j}(x_j)\right)\,\mathrm d(x_k)_{k=1}^n\\&=\sum_{i=1}^n\int_\Bbb R x_i\, p_{\small X_i}(x_i)\,\mathrm dx_i\\&= \sum_{i=1}^n\,\mathsf E(X_i)\\[2ex]\mu_{\small S}&=\sum_{i=1}^n\,\mu_{\small X_i}\end{align}$
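The first step of the derivation, building $p_S$ from the product density via the substitution $x_n = s - \sum_{i=1}^{n-1} x_i$, can be sketched numerically for $n = 2$: there the defining integral is a convolution, $p_S(s) = \int p_{X_1}(x)\,p_{X_2}(s-x)\,\mathrm dx$. The parameters below are hypothetical:

```python
import numpy as np

def npdf(x, mu, sig):
    """Density of N(mu, sig^2)."""
    return np.exp(-(x - mu)**2 / (2 * sig**2)) / (sig * np.sqrt(2 * np.pi))

# Hypothetical parameters for X_1, X_2
mu1, sig1, mu2, sig2 = 2.0, 0.8, -1.0, 1.5
dx = 0.01
x = np.arange(-15, 15, dx)

# p_S(s) = ∫ p_{X1}(x) p_{X2}(s - x) dx, discretized as a convolution;
# the output grid runs from 2*x[0] with the same spacing dx
p_s = np.convolve(npdf(x, mu1, sig1), npdf(x, mu2, sig2)) * dx
s = 2 * x[0] + dx * np.arange(len(p_s))

mass = np.sum(p_s) * dx       # should be ~1: p_S is a density
e_s = np.sum(s * p_s) * dx    # E(S) computed from p_S
print(mass, e_s, mu1 + mu2)   # e_s close to mu1 + mu2 = 1.0
```

The expectation computed from the convolved density matches $\mu_{X_1} + \mu_{X_2}$, which is what the chain of equalities above establishes analytically.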