I am trying to implement a genetic-type algorithm on unitary matrices. Hopefully I can use this question for the mutation part, but I am having an issue with the crossover step. So here is my question:
Given two unitary matrices $A$ and $B$ (both square, of the same size, with elements in $\mathbb C$), what is a clever way of combining them so that the result is still unitary? By clever I mean that if the parents $A$ and $B$ are close to each other, the child should also be close to them. I can give a precise notion of what I mean by close (if necessary), but I think most reasonable notions would satisfy me.
Besides the trivial choice of $A^iB^j$, I haven't actually been able to construct any other unitary operators from $A$ and $B$; and $A^iB^j$ is not necessarily close to $A$ or $B$.
I don't know if this is asking too much, but since the unitary matrices I'm dealing with scale like $2^n\times 2^n$ (where $n>10$), I guess another aspect of being clever should be the efficiency of the algorithm.
Edit:
So I thought of a way of generating offspring. I think it satisfies the first requirement of being clever, but I'm not sure how efficiently it can be implemented. It is built on the fact that if $H$ is Hermitian, then $e^{iH}$ is unitary. So this is how it works: $$A'=e^{i\alpha(B-A+B^\dagger-A^\dagger)}A \\ B'=e^{i\alpha(A-B+A^\dagger-B^\dagger)}B$$ where $\alpha$ is a real number. Note that the exponent is $i\alpha H$ with $H=(B-A)+(B-A)^\dagger$ Hermitian by construction, so both children are guaranteed unitary.
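A minimal sketch of this crossover in Python, using `scipy.linalg.expm` for the matrix exponential (the function name `crossover` and the default $\alpha$ are my own choices, not part of the question):

```python
import numpy as np
from scipy.linalg import expm

def crossover(A, B, alpha=0.1):
    """Crossover of two unitary matrices as proposed above.

    H = (B - A) + (B - A)^dagger is Hermitian by construction,
    so expm(1j * alpha * H) is unitary and the children stay unitary.
    Note that the exponent for B' is i*alpha*(A - B + A^dagger - B^dagger)
    = -i*alpha*H, hence the sign flip below.
    """
    H = (B - A) + (B - A).conj().T      # Hermitian: H == H^dagger
    A_child = expm(1j * alpha * H) @ A
    B_child = expm(-1j * alpha * H) @ B
    return A_child, B_child
```

For small $\alpha$ the exponent is close to zero, so each child stays close to its parent; $\alpha=0$ returns the parents unchanged. The cost is dominated by `expm`, which is $O(N^3)$ for an $N\times N$ matrix, so for $N=2^n$ with $n>10$ this will be expensive.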
I don't think this method is unique or perfect, and I would appreciate any improvements to it.
Just a sketch/idea for real-valued (orthogonal) matrices:
1) It's enough to define your operation for $I$ and $Q$. (E.g. we can let $Q = AB^{-1}$ and then multiply the final result by $B$.)
2) $Q$ is a rotation matrix. Define $R_a(Q), 0\le a \le 1$, as the rotation performed not the whole way, but only $a$ fraction of the way. If $a=0$, the result is $I$. If $a=1$, the result is $Q$.
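One way to realize $R_a(Q)$ concretely is the fractional power $Q^a = e^{a\log Q}$: for a rotation matrix, $\log Q$ is skew-symmetric, so $e^{a\log Q}$ is again a rotation, interpolating from $I$ at $a=0$ to $Q$ at $a=1$. A sketch under that assumption (caveat: `logm` takes the principal branch, which is ill-defined if $Q$ has eigenvalue $-1$, i.e. contains a 180-degree rotation):

```python
import numpy as np
from scipy.linalg import expm, logm

def partial_rotation(Q, a):
    """R_a(Q) = exp(a * log(Q)): perform the rotation Q only
    a fraction `a` of the way (a=0 gives I, a=1 gives Q)."""
    L = logm(Q)          # (numerically) skew-symmetric for a rotation Q
    return expm(a * L)

def crossover(A, B, a=0.5):
    """Combine two rotations: go a fraction `a` of the way from B to A."""
    Q = A @ B.T          # B is orthogonal, so B^{-1} = B^T
    return partial_rotation(Q, a) @ B
```

By construction $\exp(\tfrac12\log Q)\exp(\tfrac12\log Q)=Q$, i.e. $R_{1/2}(Q)$ is a genuine "half rotation". For unitary (complex) matrices the same idea applies with $B^\dagger$ in place of $B^T$, up to the branch caveat above.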