I have a question about the algorithm for testing trivalent graph isomorphism in polynomial time. The paper "Isomorphism of graphs of bounded valence can be tested in polynomial time" by Luks was complicated for me. As I understand it, given 2 trivalent graphs, we can test whether they are isomorphic in polynomial time. My question is: if they are isomorphic, can we also find the isomorphism? Thank you.
Link to the paper "Isomorphism of graphs of bounded valence can be tested in polynomial time" by Luks: https://ix.cs.uoregon.edu/~luks/iso.pdf
I believe there's a standard trick for turning an algorithm that checks whether 2 graphs are isomorphic into one that produces such an isomorphism. It'll be slightly complicated here by the fact that our algorithm only works for trivalent graphs.
The idea is this: we delete parts of the graph and re-run our algorithm. If we ever get 2 non-isomorphic graphs, we know we cannot map the respective parts to each other. This lets us do a greedy search for the isomorphism.
One should be concerned that having isomorphic subgraphs after deleting some vertices is only a necessary condition for an isomorphism mapping the deleted vertices to each other; it may also happen by accident. I don't immediately recall a proof that it is sufficient, but it's relatively easy to mark vertices so that they have to be mapped to each other. Usually we do things like adding a bunch of self loops to a vertex, or attaching a copy of $K_n.$ In our case, to stay trivalent, we can attach a chain of suitable length to the vertices that need to be mapped to each other.
I'm going to assume we allow self loops and multiple edges, and that a loop counts as 2 outgoing edges. If not, the construction changes a bit, but nothing important breaks.
Fix our two graphs $G, H,$ which are isomorphic; say they each have $n$ vertices. Let $A(G,H)$ be our algorithm for testing trivalent isomorphism. For later use, suppose there are no $k$ vertices $v_1, \ldots, v_k$ with 2 edges between $v_{2i-1}, v_{2i},$ 1 edge between $v_{2i}, v_{2i+1},$ and a self loop on $v_k$ (we can check this quickly: just inspect the distance-$\leq k$ neighborhoods of vertices with one self loop). Call this pattern a chain of length $k,$ and note $k$ should be odd.
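As a quick illustration of the check (my own representation, not the paper's): storing a multigraph as a dict mapping each vertex to a `Counter` of neighbor multiplicities, with a self loop stored once, we can scan for such chains by walking backward from each looped vertex, expecting alternating single and double edges. This simplified detector assumes degrees are at most 3, as in our graphs:

```python
from collections import Counter

def has_chain(g, k):
    # Look for v_1, ..., v_k with double edges (v_1,v_2), (v_3,v_4), ...,
    # single edges (v_2,v_3), (v_4,v_5), ..., and a self loop on v_k (k odd).
    for end in g:
        if g[end].get(end, 0) != 1:
            continue                 # a chain ends in exactly one self loop
        prev, cur, length = None, end, 1
        want_double = False          # the edge (v_{k-1}, v_k) is single
        while length < k:
            mult = 2 if want_double else 1
            nxts = [u for u, m in g[cur].items()
                    if u != cur and u != prev and m == mult]
            if len(nxts) != 1:
                break
            prev, cur = cur, nxts[0]
            length += 1
            want_double = not want_double
        if length == k:
            return True
    return False
```

Before running the reduction, one would pick the smallest odd $k$ for which both `has_chain(G, k)` and `has_chain(H, k)` are false.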
Fix a vertex $r \in G.$ It has at most $3$ neighbors, say $r_i.$ Construct a new trivalent graph $G'$ by deleting $r$ and attaching a chain of length $k$ to each $r_i,$ with the base $v_1$ connected to $r_i.$ So $G'$ has (at most) $3k + n - 1$ vertices.
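To make the gadget concrete, here is a sketch in Python, using the same multigraph representation as above (a dict of `Counter`s, a self loop stored once but counted twice toward the degree); all names here are my own, not from the paper:

```python
from collections import Counter

def degree(g, v):
    # a self loop (stored once) contributes 2 to the degree
    return sum(m * (2 if u == v else 1) for u, m in g[v].items())

def attach_chain(g, base, k, tag):
    # Fresh chain vertices tag + (1,), ..., tag + (k,): double edges on
    # (v1,v2), (v3,v4), ..., single edges on (v2,v3), ..., a self loop
    # on v_k, and a single edge base -- v1.
    vs = [tag + (i,) for i in range(1, k + 1)]
    for v in vs:
        g[v] = Counter()
    for i in range(k - 1):
        mult = 2 if i % 2 == 0 else 1   # double, single, double, ...
        g[vs[i]][vs[i + 1]] += mult
        g[vs[i + 1]][vs[i]] += mult
    g[vs[-1]][vs[-1]] += 1              # self loop marking the chain's end
    g[base][vs[0]] += 1
    g[vs[0]][base] += 1

def delete_and_mark(g, r, k):
    # G': delete r, then attach a length-k chain to each former neighbor of r.
    h = {v: Counter(nbrs) for v, nbrs in g.items() if v != r}
    neighbors = [u for u in g[r] if u != r]
    for u in neighbors:
        del h[u][r]
    for j, u in enumerate(neighbors):
        attach_chain(h, u, k, ("chain", r, j))
    return h
```

Note that every vertex of $G'$ still has degree 3: each $r_i$ trades its edge to $r$ for the edge to its chain's base, and the chain vertices have degree 3 by construction.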
Now, for every vertex $h \in H,$ we do the exact same construction with $h$ playing the role of $r$ above. We get new graphs $H_h,$ and run $A(G', H_h).$ Some good things happen:
Since $G, H$ have no $k$-chains, any isomorphism of $G'$ and $H_h$ has to send the $r_i$ to the neighbors of $h.$ Otherwise we'd have an isomorphism sending some vertex adjacent to a $k$-chain to a vertex that isn't, which is absurd.
Since $G, H$ are isomorphic, there is some $H_h$ such that $G'$ and $H_h$ are too. In fact, if $\phi: G \rightarrow H$ is an isomorphism, then $G'$ is isomorphic to $H_{\phi(r)}.$
These facts together imply that if we can find an isomorphism of $G'$ and $H_h,$ then we can construct an isomorphism of $G$ and $H$ by taking the same map (which defines a bijection between the vertices of $G$ other than $r$ and the vertices of $H$ other than $h$) and adding the assignment $\phi(r) = h.$
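In code, under the (purely illustrative) convention that chain vertices are tagged tuples beginning with `"chain"`, this combination step is just:

```python
def extend_isomorphism(phi_prime, r, h):
    # Restrict an isomorphism of G' and H_h to the original vertices
    # (dropping the chain gadgets) and add the assignment r -> h.
    phi = {v: w for v, w in phi_prime.items()
           if not (isinstance(v, tuple) and v and v[0] == "chain")}
    phi[r] = h
    return phi
```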
So now we repeat! We keep track of the chains we've added and never choose their vertices for $r$ or $h$; on the $i$th step, we attach chains of length $k + ni$ to the neighbors of the deleted vertex (these are long enough that the new chains can't be confused with previously added chains; if this makes one nervous, replace the chains on the $i$th step with chains that have a foot sticking out of the $i$th vertex from the end, or anything else that's not isomorphic to a subgraph of the rest of the graph). Terminate when only one vertex of $G$ (and of $H$) remains, the rest being chains we've added.
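Putting the pieces together, here is a sketch of one round of the greedy search (repeating the gadget helpers so the snippet stands alone, and with a brute-force backtracking test standing in for Luks's oracle $A$; the backtracker is my own toy stand-in, only usable on tiny graphs):

```python
from collections import Counter

def degree(g, v):
    return sum(m * (2 if u == v else 1) for u, m in g[v].items())

def attach_chain(g, base, k, tag):
    vs = [tag + (i,) for i in range(1, k + 1)]
    for v in vs:
        g[v] = Counter()
    for i in range(k - 1):
        mult = 2 if i % 2 == 0 else 1   # double, single, double, ...
        g[vs[i]][vs[i + 1]] += mult
        g[vs[i + 1]][vs[i]] += mult
    g[vs[-1]][vs[-1]] += 1              # self loop on the chain's end
    g[base][vs[0]] += 1
    g[vs[0]][base] += 1

def delete_and_mark(g, r, k):
    h = {v: Counter(nbrs) for v, nbrs in g.items() if v != r}
    neighbors = [u for u in g[r] if u != r]
    for u in neighbors:
        del h[u][r]
    for j, u in enumerate(neighbors):
        attach_chain(h, u, k, ("chain", r, j))
    return h

def iso_backtrack(g, h):
    # Exhaustive multigraph isomorphism with degree/loop pruning: a toy
    # stand-in for the polynomial-time trivalent test A.
    if len(g) != len(h):
        return False
    gs = list(g)

    def extend(mapping, used):
        if len(mapping) == len(gs):
            return True
        v = gs[len(mapping)]
        for w in h:
            if w in used or degree(g, v) != degree(h, w):
                continue
            if g[v].get(v, 0) != h[w].get(w, 0):
                continue
            if all(g[v].get(u, 0) == h[w].get(mapping[u], 0) for u in mapping):
                mapping[v] = w
                used.add(w)
                if extend(mapping, used):
                    return True
                del mapping[v]
                used.discard(w)
        return False

    return extend({}, set())

def candidate_images(G, H, r, k, A=iso_backtrack):
    # One round: mark G at r, keep every h in H whose marked graph A accepts.
    Gp = delete_and_mark(G, r, k)
    return [h for h in H if A(Gp, delete_and_mark(H, h, k))]
```

On $K_4$, which is vertex-transitive, every vertex of $H$ comes back as a candidate image for $r$; on less symmetric graphs the list shrinks. The full algorithm would commit to one candidate, record the pair, and recurse with longer chains as described above.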
In the worst case, we call our algorithm $n$ times per step, for $n$ steps, on graphs of size at most $n + 3\sum_{i=1}^{n}(k + ni) = O(nk + n^3),$ which is $O(n^3)$ since we may take $k = O(n)$ (a graph on $n$ vertices contains no chain longer than $n$). This is a composition of two polynomials, so still polynomial: if the time complexity of $A$ is $P(n),$ then our total time complexity is $O(n^2 P(n^3)).$
I imagine there are several efficiency improvements that could be made to this reduction, though I suspect that in practice it's easier to keep track of the relevant data inside the algorithm $A$ itself.