We all have a good intuition for orthogonal vectors in Euclidean space; how does this extend to complex vectors? Algebraically, the thing that bugs me is that the dot product is only symmetric up to a conjugate, which seems strange to me.
I considered setting up a system of equations to solve for an example of two orthogonal complex vectors, but I wasn't sure it was going anywhere. I wrote $[a+bi,c+di]*[e+fi,g+hi]=0$, which got me to $eb-af+dg-hc=ae+fb+gc+hd$, and I think that's right. Does anyone have an intuitive way of thinking about orthogonality in a complex space?
(Disclaimer: this area is far from my domain of knowledge.)
You made a mistake with your inner product: you should get two equations, one for the real component and one for the imaginary component, each set to zero separately. Conjugating the second vector, the inner product is $(ae+bf+cg+dh)+i(be-af+dg-ch)$, so orthogonality requires $ae+bf+cg+dh=0$ and $be-af+dg-ch=0$. (You found both expressions, but set them equal to each other instead of each to zero.)
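As a sanity check, here is a small Python sketch of the Hermitian inner product $\langle u,v\rangle=\sum_k u_k\overline{v_k}$, verifying that its real and imaginary parts match the two expressions above (the sample numbers are arbitrary):

```python
def hermitian_dot(u, v):
    """Inner product with the conjugate taken on the second vector."""
    return sum(uk * vk.conjugate() for uk, vk in zip(u, v))

# Arbitrary sample components for [a+bi, c+di] and [e+fi, g+hi].
a, b, c, d = 1.0, 2.0, 3.0, 4.0
e, f, g, h = 2.0, -1.0, -4.0, 3.0

u = [complex(a, b), complex(c, d)]
v = [complex(e, f), complex(g, h)]

ip = hermitian_dot(u, v)
# Real part:      a*e + b*f + c*g + d*h
# Imaginary part: b*e - a*f + d*g - c*h
assert abs(ip.real - (a*e + b*f + c*g + d*h)) < 1e-12
assert abs(ip.imag - (b*e - a*f + d*g - c*h)) < 1e-12
```

Orthogonality means both the real and imaginary parts of `ip` vanish, which is exactly the pair of equations above.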
If you're looking at the inner product of two vectors, you can choose a basis such that one of the vectors has the form $[a+bi,0]$. Your orthogonality condition then becomes $[a+bi,0]*[e+fi,g+hi]=0$, i.e. $(a+bi)\overline{(e+fi)}=0$. Since $a+bi\neq 0$, it's clear that $e=0$, $f=0$, while $g$ and $h$ are unconstrained. Orthogonality doesn't change much in a complex vector space compared to a real one. In particular, the inner product of orthogonal vectors is symmetric, since the complex conjugate of zero is zero.
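A concrete instance of this in Python: a vector orthogonal to $[a+bi,0]$ must have a zero first component, and the zero inner product is symmetric under swapping the arguments (the sample vectors are arbitrary):

```python
def hermitian_dot(u, v):
    """Inner product with the conjugate taken on the second vector."""
    return sum(uk * vk.conjugate() for uk, vk in zip(u, v))

u = [complex(1, 2), 0]       # the "aligned" basis vector [a+bi, 0]
v = [0, complex(3, -5)]      # first component zero, second arbitrary

assert hermitian_dot(u, v) == 0
# Orthogonality is symmetric even though the product conjugates one side:
assert hermitian_dot(v, u) == 0
```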
What's trickier to understand is the inner product of parallel vectors. Personally, I think of complex vectors in polar form, $[R_ae^{i\theta_a},R_be^{i\theta_b}]$. If we take the inner product of two parallel vectors (again choosing a convenient basis):
$[R_1e^{i\theta_1},0]*[R_2e^{i\theta_2},0]=R_1e^{i\theta_1}\,\overline{R_2e^{i\theta_2}}=R_1R_2e^{i(\theta_1-\theta_2)}$
It's clear that their magnitudes multiply, just like with vectors over the reals. The key difference is that the inner product also has a phase component. Thus the inner product tells you to what degree two vectors are parallel, and also to what degree the parallel components are in phase.
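The magnitude-and-phase reading of the inner product can be sketched numerically; the radii and angles below are arbitrary sample values:

```python
import cmath

# Parallel complex vectors in polar form: the inner product's magnitude
# is R1*R2 and its phase is the phase difference theta1 - theta2.
R1, theta1 = 2.0, 0.75
R2, theta2 = 3.0, 0.25

u = [R1 * cmath.exp(1j * theta1), 0]
v = [R2 * cmath.exp(1j * theta2), 0]

# Hermitian inner product: conjugate the second vector.
ip = sum(uk * vk.conjugate() for uk, vk in zip(u, v))

assert abs(abs(ip) - R1 * R2) < 1e-12                    # magnitudes multiply
assert abs(cmath.phase(ip) - (theta1 - theta2)) < 1e-12  # phases subtract
```

So a large magnitude of the inner product says the vectors are strongly parallel, and its argument says how far out of phase those parallel components are.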