I am working on a music synthesizer project. We are just kind of playing around trying to get an interesting way to vary $N$ parameters with a nice balance of predictability and surprise. I had an idea to choose a random orthogonal (complex) $N\times N$ matrix $\mathbf{A}$, and then solve the linear ODE $y' = \mathbf{A}y$ (using Euler's method). But something strange happens -- there are a few seconds of interesting semi-chaos, but then things quickly seem to start to settle into a very simple periodic pattern, with all parameters getting stuck in lock-step with each other, all with the same period and even most with the same phase.
My question is, is this behavior some interesting property of ODEs on random matrices, or should this not be happening and my solver has some problem? What is the typical behavior of $e^{\mathbf{A}t}$ for $\mathbf{A}$ random and orthogonal?
Possibly important detail(?): after each step of Euler's method, I normalize the vector (otherwise I start getting exponential blowup from rounding errors). $\mathbf{A}$ is orthogonal so in a perfect world this wouldn't do anything. But I'm intuitively worried this normalization step is causing some damping of the system that is suppressing its interesting dynamics.
You have a slight error in your thinking here. For the exact solution of a linear ODE to preserve the length of the vector, you want an anti-symmetric or skew-symmetric matrix, $X^T=-X$: then $\frac{d}{dt}\|y\|^2 = y^T(X+X^T)y = 0$ along solutions of $y'=Xy$. With such a matrix you can apply the Euler method with projection back to the unit sphere.
With an orthogonal matrix $A$ you can instead do a discrete iteration $x_{k+1}=Ax_k$. This might not produce a path that is recognizably continuous, as each step can rotate the vector by a large angle.
To combine both approaches, generate the skew-symmetric matrix $X$, which is easier than producing an orthogonal one directly, and set $A=(I+hX)(I-hX)^{-1}$ for a combined step size of $2h$. This can also be interpreted as a combination of an explicit and an implicit Euler step, or as the implicit midpoint method. One can easily check that $AA^T=I$, using $(I\pm hX)^T=I\mp hX$ and the fact that the two factors commute, so that the step matrix is indeed orthogonal.
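A minimal numpy sketch of this Cayley-transform construction (the dimension, step size, and variable names are my own illustration choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 8, 0.05

# Random skew-symmetric matrix: X.T == -X
M = rng.standard_normal((n, n))
X = (M - M.T) / 2

# Cayley transform: A = (I + hX)(I - hX)^{-1} is exactly orthogonal
I = np.eye(n)
A = (I + h * X) @ np.linalg.inv(I - h * X)

# Iterating this orthogonal step preserves the norm (up to rounding)
y = rng.standard_normal(n)
y /= np.linalg.norm(y)
for _ in range(10_000):
    y = A @ y
```

Since $A$ is exactly orthogonal here, no renormalization step is needed beyond occasionally correcting accumulated floating-point drift.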
Another way to generate a random orthogonal matrix with small steps is to randomly generate Givens rotations with small angles and compose them into one matrix. (Or keep the chain of rotations as it is and apply it piece-by-piece to the vector. This is useful if the number of randomly placed rotations is much smaller than $n^2/2$.)
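The Givens-rotation construction might look as follows in numpy (the rotation count `k` and the angle scale `eps` are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, eps = 8, 40, 0.05  # k small-angle rotations in random coordinate planes

def givens(n, i, j, theta):
    """Identity matrix except for a rotation by theta in the (i, j) plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = -s, s
    return G

# Compose random small-angle Givens rotations into one orthogonal matrix
A = np.eye(n)
for _ in range(k):
    i, j = rng.choice(n, size=2, replace=False)
    A = givens(n, i, j, eps * rng.standard_normal()) @ A
```

Keeping the rotations as a list and applying them one by one to the vector costs $O(1)$ per rotation instead of a full matrix product, which is the advantage mentioned above when the number of rotations is small.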
What you observe in numerically solving $\dot x=Ax$ is similar to what happens in power iteration: over time, the dynamics concentrate on the eigenvalues of $A$ with the largest (usually positive) real part, since the exact solution grows like $e^{\operatorname{Re}(\lambda)t}$ in each eigenspace. The components in the dominant eigenspace grow relatively faster than those in any other eigenspace, and the rescaling that you perform then also shrinks the contributions of the other eigenspaces in absolute terms.
So what remains after some time is an oscillation with the frequency given by the imaginary part of the eigenvalue with the largest real part. As the eigenvalues of an orthogonal matrix all lie on the unit circle, this dominant conjugate eigenvalue pair will be close to the extremal point $1+0i$, so the dominant frequency of a random orthogonal matrix will usually be low.
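One can reproduce the questioner's normalized-Euler iteration in numpy and check that the state really does collapse onto the invariant plane of the eigenvalue pair closest to $1+0i$ (a sketch; the seed, dimension, step size, and iteration count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, h, steps = 16, 0.1, 100_000

# Random orthogonal matrix from the QR decomposition of a Gaussian matrix,
# with determinant forced to +1 so eigenvalues come in conjugate pairs
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(A) < 0:
    A[:, 0] *= -1

# Explicit Euler with renormalization, as described in the question
y = rng.standard_normal(n)
y /= np.linalg.norm(y)
for _ in range(steps):
    y = y + h * (A @ y)
    y /= np.linalg.norm(y)

# Eigenvalue of A with largest real part (the one closest to 1 + 0i)
w, V = np.linalg.eig(A)
v = V[:, np.argmax(w.real)]

# y should now lie (almost) in the 2D real invariant subspace of that pair
B, _ = np.linalg.qr(np.column_stack([v.real, v.imag]))
resid = np.linalg.norm(y - B @ (B.T @ y))
print(resid)  # the other eigendirections have been damped out
```

The Euler step multiplies each eigencomponent by $1+h\lambda$, and $|1+h\lambda|$ is largest for the eigenvalue on the unit circle nearest $1$, which is why the normalized iteration converges to this slow oscillation.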