Why does a neural network's parameter space have a Riemannian metric structure?


The first line of the natural gradient paper states:

The parameter space of neural networks has a Riemannian metric structure.

I don't know why. I have basic knowledge of Riemannian metrics: a Riemannian metric defines an inner product on each local tangent space of a Riemannian manifold, and I know a Riemannian manifold is in general a curved space. But I don't see how the NN parameter space is a curved space rather than a Euclidean space.


I have briefly read the corresponding material mentioned in the paper. It seems to regard a NN as a probability density function lying in L1 space, suggesting that the parameter space is constrained so that the function stays in L1. I am fine with that explanation. But as far as I know, the parameters of a network are not necessarily under any constraint. And I doubt a NN is an L1 function, because usually we just apply a softmax layer to the raw output of the NN to make the outputs lie in (0, 1).
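To make my confusion concrete, here is a minimal sketch (my own toy example in Python/NumPy, not from the paper): a one-parameter "network" whose sigmoid output is a probability distribution over a binary label, so I can compute its Fisher information, which I understand is the metric the paper puts on parameter space. The metric value visibly changes with the parameter, which seems different from a flat Euclidean metric.

```python
import numpy as np

# Toy model (my own notation, not the paper's): p(y=1 | x; w) = sigmoid(w * x).
# For each w this output is a probability distribution over y in {0, 1},
# so the Fisher information F(w) = E_y[(d/dw log p(y|x;w))^2] is defined.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fisher_numeric(w, x, eps=1e-5):
    # Finite-difference score, exact expectation over y in {0, 1}.
    p = sigmoid(w * x)
    def logp(y, w_):
        q = sigmoid(w_ * x)
        return np.log(q) if y == 1 else np.log(1.0 - q)
    total = 0.0
    for y, prob in [(1, p), (0, 1.0 - p)]:
        score = (logp(y, w + eps) - logp(y, w - eps)) / (2.0 * eps)
        total += prob * score ** 2
    return total

def fisher_exact(w, x):
    # Closed form for this model: F(w) = p(1-p) * x^2.
    p = sigmoid(w * x)
    return p * (1.0 - p) * x ** 2

x = 2.0
for w in [0.0, 1.0, 3.0]:
    print(w, fisher_numeric(w, x), fisher_exact(w, x))
```

Running this, F(w) shrinks as w moves away from 0, so the "length" the metric assigns to a parameter step depends on where you are in parameter space. Is this dependence what makes the space Riemannian rather than Euclidean?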