Expected Value of neural network output


I'm attempting to find a closed-form expected value of a particularly messy expression involving dot products of multiple neural network outputs.

A sub-problem of this is to consider just a single feed-forward network.


Consider the feed-forward neural network with weight matrices $W_1 \in \mathbb{R}^{k_2 \times k_1}$, $W_2\in \mathbb{R}^{k_3 \times k_2}$, and input $x\in \mathbb{R}^{k_1}$. Treat each component of these as iid normally distributed random variables with means $\mu_{W_1}, \mu_{W_2}, \mu_{x}$ and variances $\sigma_{W_1}^2, \sigma_{W_2}^2, \sigma_{x}^2$, respectively. E.g. $[W_1]_{i,j} \sim \mathcal{N}(\mu_{W_1},\sigma_{W_1}^2)$.

The neural network output would be

$$ o(W_1,W_2,x) := g(W_2 g(W_1x))$$

where $g$ is the ReLU function, applied componentwise.
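For concreteness, here is a minimal sketch of the forward pass described above; the function names are my own, not from the post:

```python
import numpy as np

def relu(z):
    # g(z) = max(z, 0), applied componentwise
    return np.maximum(z, 0.0)

def network_output(W1, W2, x):
    # o(W1, W2, x) = g(W2 g(W1 x)), with W1 of shape (k2, k1),
    # W2 of shape (k3, k2), and x of shape (k1,)
    return relu(W2 @ relu(W1 @ x))
```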

What then would be its expected value?

$$ E[g(W_2 g(W_1x))]$$

I know this can fairly easily be estimated with Monte Carlo methods, but is it possible to derive a closed-form expression, or at least a closed-form approximation?
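The Monte Carlo baseline mentioned above can be sketched as follows; all names and the sample count are illustrative choices, not part of the question:

```python
import numpy as np

def mc_expected_output(mu_W1, sig_W1, mu_W2, sig_W2, mu_x, sig_x,
                       k1, k2, k3, n_samples=10_000, seed=0):
    """Monte Carlo estimate of E[g(W2 g(W1 x))], g = ReLU.

    Each entry of W1, W2, x is drawn iid N(mu, sig^2), matching the
    problem statement. Returns a vector of length k3.
    """
    rng = np.random.default_rng(seed)
    relu = lambda z: np.maximum(z, 0.0)
    total = np.zeros(k3)
    for _ in range(n_samples):
        W1 = rng.normal(mu_W1, sig_W1, size=(k2, k1))
        W2 = rng.normal(mu_W2, sig_W2, size=(k3, k2))
        x = rng.normal(mu_x, sig_x, size=k1)
        total += relu(W2 @ relu(W1 @ x))
    return total / n_samples
```

As a sanity check, with all variances set to zero the estimator collapses to the deterministic output of the mean network.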