The loss function of stacked autoencoder


Consider a stacked autoencoder with five symmetric layers (the original figure is not available), and let $x_i$ denote the activation vector at layer $i$, so $x_1$ is the input and $x_5$ is the reconstruction.

When we train this autoencoder, which loss function should we use?

Should the loss be only the end-to-end reconstruction error, $\|x_5 - x_1\|^2$,

or should it sum the errors over all symmetric layer pairs: $\|x_5 - x_1\|^2 + \|x_4 - x_2\|^2$?
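To make the two candidates concrete, here is a minimal NumPy sketch that computes both. The layer sizes, ReLU activations, and random weights are placeholders, not part of the question; the point is only the difference between the two loss expressions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# Hypothetical layer sizes for a 5-layer symmetric autoencoder:
# x1 (input) -> x2 -> x3 (code) -> x4 -> x5 (reconstruction)
sizes = [8, 4, 2, 4, 8]
Ws = [rng.normal(scale=0.5, size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]

def forward(x1):
    """Return the activations [x1, x2, x3, x4, x5]."""
    xs = [x1]
    for W in Ws:
        xs.append(relu(W @ xs[-1]))
    return xs

x1 = rng.normal(size=sizes[0])
xs = forward(x1)

# Option 1: end-to-end reconstruction loss only, ||x5 - x1||^2.
loss_end_to_end = np.sum((xs[4] - xs[0]) ** 2)

# Option 2: also penalize the symmetric pair, adding ||x4 - x2||^2.
loss_symmetric = loss_end_to_end + np.sum((xs[3] - xs[1]) ** 2)

print(loss_end_to_end, loss_symmetric)
```

Since the extra term is a squared norm, the symmetric-pair loss is always at least as large as the end-to-end loss; the two coincide only when $x_4 = x_2$.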

Thanks.