[Figure: graph of output vs. weighted sum]
The linear activation function $$ z = b+\sum_{i=0}^k x_iw_i $$ is used in simple linear neurons in a neural network. Graphs of $z$ versus the weighted sum of the inputs plus the bias show a line passing through the origin, $(0, 0)$. Is this a strict requirement? Can the line not pass through the origin?
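To make the question concrete, here is a minimal sketch of a simple linear neuron (the weights and bias values are hypothetical, chosen only for illustration). It shows that at the zero input the output equals the bias $b$, so the graph of output versus input passes through the origin only when $b = 0$:

```python
import numpy as np

def linear_neuron(x, w, b):
    """Simple linear neuron: the output is the pre-activation z = b + sum(x_i * w_i)."""
    return b + np.dot(x, w)

w = np.array([0.5, -0.25])  # hypothetical weights
b = 0.1                     # nonzero bias

# At the origin of input space the weighted sum vanishes,
# so the output equals the bias, not zero:
print(linear_neuron(np.zeros(2), w, b))  # 0.1
```

With $b = 0$ the same neuron would output $0$ at the zero input, which is the origin-crossing case the question describes.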
Linear Activation Function in Neural Network

73 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) at 2026-03-24 22:09:31.1774390171

1
This is not a strict requirement for activation functions in general, but for the simple linear neuron it is. For instance, the sigmoid activation function has a $y$-intercept of $1/2$ (since $\sigma(0) = 1/2$), whereas a linear activation typically has a $y$-intercept of $0$.
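The intercepts mentioned above can be checked directly; this short sketch evaluates the standard sigmoid $\sigma(z) = 1/(1+e^{-z})$ and the identity (linear) activation at $z = 0$:

```python
import math

def sigmoid(z):
    """Standard logistic sigmoid: 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def linear(z):
    """Identity (linear) activation."""
    return z

# Intercepts at z = 0:
print(sigmoid(0.0))  # 0.5 -- sigmoid crosses the y-axis at 1/2
print(linear(0.0))   # 0.0 -- linear activation passes through the origin
```

This is why the line in the graph passes through $(0, 0)$ for the linear activation: the activation itself maps $z = 0$ to $0$; any vertical shift comes only from the bias term.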