Integrating a composition of non-linear functions (neural network output)


I need help integrating the output of a one-hidden-layer autoencoder neural network, i.e., integrating the reconstruction function. Both the encoder and the decoder use the ReLU activation.

Any ideas on how to do this?

$$\int \operatorname{relu}\left( W^T \operatorname{relu}\left( Wx+b \right)+b' \right) dx$$ where $W, W^T$ are matrices, $x, b, b'$ are vectors, and $\operatorname{relu}(a) = \max(0,a)$ applied element-wise.
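In case a closed form proves elusive (ReLU makes the integrand piecewise-linear, so a closed form would have to be assembled region by region), a numerical estimate is easy to sketch. Below is a minimal Monte Carlo estimate in Python/NumPy, under two assumptions not stated in the question: the integral is taken over a bounded hypercube (over all of $\mathbb{R}^d$ it diverges, since ReLU grows linearly), and the result is the vector-valued integral taken component-wise. All names (`reconstruction`, `mc_integral`) are hypothetical.

```python
import numpy as np

def relu(a):
    """Element-wise ReLU: max(0, a)."""
    return np.maximum(0.0, a)

def reconstruction(x, W, b, b_prime):
    """Decoder(encoder(x)) for a one-hidden-layer ReLU autoencoder:
    relu(W^T relu(W x + b) + b')."""
    return relu(W.T @ relu(W @ x + b) + b_prime)

def mc_integral(W, b, b_prime, lo=0.0, hi=1.0, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the vector-valued integral of the
    reconstruction function over the hypercube [lo, hi]^d,
    where d = W.shape[1] is the input dimension."""
    rng = np.random.default_rng(seed)
    d = W.shape[1]
    xs = rng.uniform(lo, hi, size=(n_samples, d))
    vals = np.array([reconstruction(x, W, b, b_prime) for x in xs])
    volume = (hi - lo) ** d
    # Mean of the integrand times the volume of the domain.
    return volume * vals.mean(axis=0)
```

Every component of the estimate is non-negative, since the outer ReLU makes the integrand non-negative; that gives a quick sanity check on any implementation.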