As artificial intelligence has grown more sophisticated in recent years, I would like to explore the structure of neural networks. First, is there a canonical notion of distance between neural networks such that the distance between A and B is maximal when A is isomorphic, as a graph, to the complement of B? If so, are there results on the distribution of distances between the sub-networks of a given neural network, and on how that distribution relates to the range of tasks the network can perform efficiently?
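To make the extremal condition concrete, here is a minimal pure-Python sketch (the helper names are mine, not standard) that checks by brute force whether a graph A is isomorphic to the complement of B; any candidate distance satisfying the property above would have to attain its maximum exactly on such pairs:

```python
from itertools import permutations

def complement(edges, n):
    # Edges are frozensets of vertex pairs over vertices 0..n-1.
    all_pairs = {frozenset((i, j)) for i in range(n) for j in range(i + 1, n)}
    return all_pairs - edges

def isomorphic(e1, e2, n):
    # Brute force over all vertex bijections; only viable for tiny graphs.
    for perm in permutations(range(n)):
        mapped = {frozenset((perm[u], perm[v])) for e in e1 for u, v in [tuple(e)]}
        if mapped == e2:
            return True
    return False

# The path P4 (0-1-2-3) is self-complementary, so A attains the
# hypothetical maximal distance from itself.
n = 4
A = {frozenset(p) for p in [(0, 1), (1, 2), (2, 3)]}
print(isomorphic(A, complement(A, n), n))  # -> True
```

This only illustrates the defining condition; I am not aware of an established metric on neural-network architectures built around it, which is precisely what the question asks.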
I'm also interested in the relation between the symmetry group of a sub-network, when its vertices are mapped to a regular lattice, and its efficiency at solving problems whose structure admits the same symmetry group. For example, is a neural network more efficient at identifying shapes such as ellipses or rectangles if, in this framework, it itself admits the Klein four-group as its symmetry group?
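One way to read "admits the Klein four-group as symmetry group in this framework" is that the two axis reflections of the lattice induce graph automorphisms of the sub-network. A toy sketch under that reading (all names are mine), using a 3x3 grid graph whose automorphisms include the reflections generating a Klein four-group:

```python
from itertools import product

def is_automorphism(edges, sigma):
    # sigma maps vertices to vertices; check it preserves the edge set.
    return {frozenset((sigma[u], sigma[v])) for e in edges for u, v in [tuple(e)]} == edges

# Vertices of a 3x3 lattice with 4-neighbour adjacency.
V = list(product(range(3), repeat=2))
edges = {frozenset((p, q)) for p in V for q in V
         if abs(p[0] - q[0]) + abs(p[1] - q[1]) == 1}

# Klein four-group: identity, the two axis reflections, and their composition.
h = {(x, y): (2 - x, y) for x, y in V}        # horizontal flip
v = {(x, y): (x, 2 - y) for x, y in V}        # vertical flip
hv = {(x, y): (2 - x, 2 - y) for x, y in V}   # composition (180-degree rotation)

print(all(is_automorphism(edges, s) for s in (h, v, hv)))  # -> True
```

An ellipse or a rectangle is invariant under exactly these two reflections, which is why the Klein four-group is the natural candidate group in the question; whether matching the group improves recognition efficiency is the open part.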