Let $\mathcal{M} = \{p_\theta := p(\cdot | \theta), \theta \in \Theta\}$ be a statistical manifold with the Fisher information metric: $$g_{jk}(\theta) = \operatorname{E}\left[\left(\frac{\partial}{\partial\theta_j}\log p(X;\theta)\right)\left(\frac{\partial}{\partial\theta_k}\log p(X;\theta)\right)\right].$$
The Wikipedia article on the topic derives the metric from the Euclidean metric by a change of variables. I can follow the procedure, but I have questions about the flatness of $\mathcal{M}$. In Amari's book, *Information Geometry and Its Applications*, it is said that such a manifold is flat (dually flat, in fact), so:
1- Is the above derivation enough to conclude that the manifold is flat (i.e., the fact that the metric is derived from the Euclidean metric)?
2- Is there a straightforward way to show that the curvature is 0 everywhere?
This is a common source of confusion.
The flatness in information geometry refers to the pair of dual connections $\nabla$ and $\nabla^*$ (dual with respect to the Fisher-Rao metric), called the exponential and mixture connections, and not to the metric or its Levi-Civita connection. For the latter, see Bruveris, M., & Michor, P. W. (2019). Geometry of the Fisher-Rao metric on the space of smooth densities on a compact manifold. Mathematische Nachrichten, 292(3), 511–523. https://doi.org/10.1002/mana.201600523
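For concreteness, both connections are members of Amari's $\alpha$-family; a compact way to write this (standard notation, with $\ell(x;\theta) = \log p(x;\theta)$ and $\partial_i = \partial/\partial\theta_i$) is:

```latex
% Amari's \alpha-connections on a statistical manifold:
\Gamma^{(\alpha)}_{ij,k}(\theta)
  = \operatorname{E}\!\left[\left(
      \partial_i \partial_j \ell
      + \frac{1-\alpha}{2}\, \partial_i \ell \, \partial_j \ell
    \right) \partial_k \ell \right]
% \alpha = 1:  exponential connection \nabla
% \alpha = -1: mixture connection     \nabla^*
% \alpha = 0:  Levi-Civita connection of the Fisher metric
```

"Dually flat" means the $\alpha = \pm 1$ connections are both flat; the $\alpha = 0$ (Levi-Civita) connection generally is not.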
Finite-dimensional statistical manifolds can have negative Riemannian curvature: for example, the information manifold of univariate Gaussians, equipped with the Fisher-Rao metric, is isometric (up to a constant rescaling) to the Poincaré half-plane.
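This is easy to verify symbolically. The following sketch (using SymPy, in the standard coordinates $(\mu, \sigma)$, and the Gaussian-curvature formula for a diagonal 2D metric) computes the Fisher metric of $N(\mu, \sigma^2)$ and its curvature:

```python
# Symbolic check that the Fisher-Rao metric of the univariate Gaussian
# family N(mu, sigma^2) has constant negative curvature -1/2.
import sympy as sp

x, mu = sp.symbols('x mu', real=True)
s = sp.symbols('sigma', positive=True)

# Density and log-likelihood of N(mu, sigma^2)
p = sp.exp(-(x - mu)**2 / (2 * s**2)) / sp.sqrt(2 * sp.pi * s**2)
l = sp.log(p)

# Fisher information matrix: g_jk = E[(d_j l)(d_k l)]
params = (mu, s)
g = sp.Matrix(2, 2, lambda i, j: sp.simplify(
    sp.integrate(sp.diff(l, params[i]) * sp.diff(l, params[j]) * p,
                 (x, -sp.oo, sp.oo))))
print(g)  # diag(1/sigma^2, 2/sigma^2)

# Gaussian curvature of a diagonal 2D metric ds^2 = E du^2 + G dv^2:
# K = -1/(2 sqrt(EG)) [ d_u(G_u / sqrt(EG)) + d_v(E_v / sqrt(EG)) ]
E, G = g[0, 0], g[1, 1]
r = sp.sqrt(E * G)
K = sp.simplify(-(sp.diff(sp.diff(G, mu) / r, mu)
                  + sp.diff(sp.diff(E, s) / r, s)) / (2 * r))
print(K)  # constant: -1/2
```

The curvature is the constant $-1/2$, so rescaling $\mu \mapsto \mu/\sqrt{2}$ gives the standard Poincaré half-plane metric up to an overall factor.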