Rules for expectations


I would like to calculate the following expectation of the output of a neural network. I assume that $E[\beta_i] = 0$ and that the activations $h_j$ are independent of the weights $W_{ij}$. The expectation of the squared output is:

$$E\left[ \left( \beta_i + \sum_{j=1}^{D_h} W_{ij}h_j \right)^2 \right]$$

How can I proceed from here? Which of my assumptions do I need? For example, is the following expansion valid?

$$E \left[ \beta_i^2 + 2\beta_i \sum_{j=1}^{D_h} W_{ij}h_j + \left(\sum_{j=1}^{D_h} W_{ij}h_j\right)^2 \right]$$ $$= E\left[\beta_i^2\right] + 2\,E\left[\beta_i \sum_{j=1}^{D_h} W_{ij}h_j\right] + E\left[\left(\sum_{j=1}^{D_h} W_{ij}h_j\right)^2\right]$$
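As a numerical sanity check on this expansion, I ran a small Monte Carlo simulation. The distributions below are just my own placeholder choices (zero-mean Gaussian $\beta_i$ and $W_{ij}$, uniform $h_j$, all mutually independent), not something given in the problem:

```python
import numpy as np

# Monte Carlo sanity check with assumed distributions:
# beta ~ N(0, 1)  (so E[beta] = 0),
# W_j  ~ N(0, 1), h_j ~ Uniform(0, 1),
# all mutually independent, with D_h = 4.
rng = np.random.default_rng(0)
n, D_h = 2_000_000, 4

beta = rng.normal(0.0, 1.0, size=n)
W = rng.normal(0.0, 1.0, size=(n, D_h))
h = rng.uniform(0.0, 1.0, size=(n, D_h))

s = (W * h).sum(axis=1)            # sum_j W_j h_j for each sample

lhs = np.mean((beta + s) ** 2)     # E[(beta + sum_j W_j h_j)^2]
term1 = np.mean(beta ** 2)         # E[beta^2]
cross = 2 * np.mean(beta * s)      # 2 E[beta * sum_j W_j h_j]
term3 = np.mean(s ** 2)            # E[(sum_j W_j h_j)^2]

# Linearity of expectation says lhs = term1 + cross + term3,
# and with these independent, zero-mean choices the cross term
# should come out close to 0.
print(lhs, term1, cross, term3)
```

With these choices the cross term does appear to vanish and `lhs` matches `term1 + term3`, which is what prompted my questions below.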

Then, are the first and second terms equal to zero? If so, why? And how can I expand the third term further?
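For the third term, here is my tentative expansion. Note that it relies on two extra assumptions of my own that were not stated above: the weights $W_{ij}$ are i.i.d. across $j$ with mean zero and variance $\sigma_w^2$, and they are independent of all the activations (not just of $h_j$ with the same index):

$$E\left[\left(\sum_{j=1}^{D_h} W_{ij}h_j\right)^2\right] = \sum_{j=1}^{D_h}\sum_{k=1}^{D_h} E\left[W_{ij}W_{ik}\right]E\left[h_j h_k\right] = \sigma_w^2 \sum_{j=1}^{D_h} E\left[h_j^2\right],$$

where the first step factorizes each term using the independence of weights and activations, and the second uses $E[W_{ij}W_{ik}] = \sigma_w^2\,\delta_{jk}$ for i.i.d. zero-mean weights. Is this reasoning correct?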