Are Neural Networks continuous?


Definition: We define a neural network with network architecture $(L,p)$, where $p=(p_0,p_1,\dots,p_{L+1})$ records the layer widths, and activation function $\sigma: \mathbb{R} \to \mathbb{R}$ as the mapping $g : \mathbb{R}^{p_0} \to \mathbb{R}^{p_{L+1}}$ with \begin{align} g(x) = W^{(L+1)}\cdot \sigma(v^{(L)}+W^{(L)}\cdot \sigma(...v^{(2)}+W^{(2)}\cdot \sigma (v^{(1)}+W^{(1)}\cdot x)...)), \end{align} where we write $\sigma(v)=(\sigma(v_1),\dots,\sigma(v_n))$ for $v\in\mathbb{R}^n$ for arbitrary $n\in \mathbb{N}$.
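To make the definition concrete, here is a minimal sketch of the forward map $g$ in NumPy, assuming $L = 2$ hidden layers and illustrative widths $p = (3, 4, 5, 2)$; all names and values below are my own choices, not from the definition itself:

```python
import numpy as np

def forward(x, weights, biases, sigma):
    """Compute g(x) = W^(L+1) . sigma(v^(L) + W^(L) . sigma(... v^(1) + W^(1) . x ...)).

    weights: [W^(1), ..., W^(L+1)]; biases: [v^(1), ..., v^(L)].
    sigma is applied componentwise, matching the definition above.
    """
    h = x
    for W, v in zip(weights[:-1], biases):
        h = sigma(v + W @ h)   # affine map, then componentwise activation
    return weights[-1] @ h     # outermost layer: W^(L+1) only, no bias/activation

# Illustrative architecture: p = (3, 4, 5, 2), i.e. L = 2 hidden layers.
rng = np.random.default_rng(0)
p = [3, 4, 5, 2]
weights = [rng.standard_normal((p[i + 1], p[i])) for i in range(3)]
biases = [rng.standard_normal(p[i + 1]) for i in range(2)]

x = rng.standard_normal(p[0])
y = forward(x, weights, biases, np.tanh)
print(y.shape)  # output lives in R^{p_{L+1}} = R^2
```

Note that $W^{(l)} \in \mathbb{R}^{p_l \times p_{l-1}}$ and $v^{(l)} \in \mathbb{R}^{p_l}$, so the shapes above line up with the widths $p$.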

Now, if the activation function $\sigma$ is continuous, the neural network should also be continuous. I guess the reasoning would be: each layer map $x \mapsto v^{(l)} + W^{(l)}x$ is affine (a linear map plus a constant shift) and therefore continuous, and applying a continuous $\sigma$ componentwise is again continuous. Thus $g$ is a finite composition of continuous functions and hence continuous itself. Am I right?
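As a numerical sanity check of this reasoning (not a proof), one can verify that with a continuous activation such as $\tanh$, shrinking an input perturbation $\delta$ shrinks the output change $\|g(x+\delta) - g(x)\|$ toward zero. The network below is an illustrative sketch with made-up widths:

```python
import numpy as np

def forward(x, weights, biases, sigma):
    # Same forward map as in the definition: affine layers + componentwise sigma.
    h = x
    for W, v in zip(weights[:-1], biases):
        h = sigma(v + W @ h)
    return weights[-1] @ h

rng = np.random.default_rng(1)
p = [3, 4, 5, 2]  # illustrative widths p_0, ..., p_{L+1}
weights = [rng.standard_normal((p[i + 1], p[i])) for i in range(3)]
biases = [rng.standard_normal(p[i + 1]) for i in range(2)]

x = rng.standard_normal(p[0])
direction = rng.standard_normal(p[0])
direction /= np.linalg.norm(direction)

# ||g(x + delta) - g(x)|| for a shrinking sequence of perturbation sizes.
for eps in [1e-1, 1e-3, 1e-5]:
    diff = np.linalg.norm(
        forward(x + eps * direction, weights, biases, np.tanh)
        - forward(x, weights, biases, np.tanh)
    )
    print(eps, diff)  # diff shrinks as eps -> 0, as continuity demands
```

Of course this only probes finitely many points; the actual argument is the composition-of-continuous-maps one in the question, which needs no computation.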