I am looking at the orientation of molecules with respect to an interface as a function of their distance from the interface. The molecules are approximately linear, so I measure the angle θ between the vector from one atom to another and the interface normal, using the dot product. I then plot a 2D histogram of the counts, with cos(θ) on the y-axis and distance from the interface on the x-axis. I have checked many times that my code calculates the angles correctly, and it does. However, my 2D histograms show a stripe of increased density at cos(θ) = 0. The stripe is very uniform (no gradient away from 0, just a stripe) and appears in every system (see image below). Here is a snippet of my code:
```python
import numpy as np

# `reference` is the (unit) interface normal, defined elsewhere

def unit_vectors(vectors):
    norms = np.linalg.norm(vectors, axis=2)
    return vectors / np.expand_dims(norms, axis=2)

def calculate_angles(vectors):
    dot_products = np.sum(unit_vectors(vectors) * reference, axis=2)
    return dot_products
```
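As a sanity check, the snippet can be exercised on vectors with known angles. This is only a sketch: it assumes `vectors` has shape `(n_frames, n_molecules, 3)` and takes the interface normal along z, which may differ from my actual setup.

```python
import numpy as np

# Assumption for illustration: interface normal along z
reference = np.array([0.0, 0.0, 1.0])

def unit_vectors(vectors):
    # Normalise each (n_frames, n_molecules, 3) vector to unit length
    norms = np.linalg.norm(vectors, axis=2)
    return vectors / np.expand_dims(norms, axis=2)

def calculate_angles(vectors):
    # Dot product of each unit vector with the reference normal -> cos(theta)
    return np.sum(unit_vectors(vectors) * reference, axis=2)

# One frame, three molecules: along +z (0 deg), along x (90 deg), along -z (180 deg)
vecs = np.array([[[0.0, 0.0, 2.0],
                  [5.0, 0.0, 0.0],
                  [0.0, 0.0, -1.0]]])
cos_theta = calculate_angles(vecs)
# cos_theta -> [[1.0, 0.0, -1.0]]
```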
At first, I thought this was a symmetry artifact of measuring angles between 0° and 180°: molecules at 90° are 'overrepresented' because they are plotted in only one place, while molecules at, say, 45° and 135° have the same physical orientation but are plotted in two separate regions of the plot (depending on the arbitrary vector direction). This would make sense, as I see roughly double the number of molecules at 90° throughout the system compared with all other angles. However, the stripe has a thickness of about cos(θ) = −0.04 to cos(θ) = 0.04, so now I am unsure.
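One way to remove the head/tail symmetry described above is to fold the distribution by histogramming |cos(θ)|, so that θ and 180°−θ land in the same bin and 90° is no longer the only self-paired angle. This is a sketch on synthetic data; the bin counts and ranges are assumptions, not my actual pipeline.

```python
import numpy as np

# Hypothetical per-molecule data: cos(theta) and distance from the interface
rng = np.random.default_rng(1)
cos_theta = rng.uniform(-1.0, 1.0, size=10000)
distance = rng.uniform(0.0, 30.0, size=10000)

# Fold the head/tail symmetry: theta and 180-theta map to the same value.
# Both 45 and 135 degrees now fall in one bin, so 90 degrees is no longer
# doubled relative to other (split) orientations.
folded = np.abs(cos_theta)

hist, dist_edges, cos_edges = np.histogram2d(
    distance, folded, bins=[30, 20], range=[[0.0, 30.0], [0.0, 1.0]]
)
```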
I am aware that the angle between uniformly random 3D vectors is distributed according to sin(θ), as discussed in Angular probability distribution between 3D vectors?. This problem seems distinct from that, given the sharp preference at 90°; the stripe doesn't go away when I normalise by sin(θ).
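On the sin(θ) point: when the histogram is binned in cos(θ), as here, an isotropic orientation distribution is already flat, so no sin(θ) correction should be needed; the sin(θ) Jacobian only appears when binning in θ itself. A quick check on synthetic isotropic vectors (normal taken along z as an assumption):

```python
import numpy as np

# Uniformly random unit vectors via normalised Gaussians (isotropic)
rng = np.random.default_rng(2)
v = rng.normal(size=(200000, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

normal = np.array([0.0, 0.0, 1.0])  # assumed interface normal
cos_theta = v @ normal

# Binned in cos(theta), an isotropic distribution gives roughly equal counts
# in every bin -- no stripe and no sin(theta) weighting required.
counts, _ = np.histogram(cos_theta, bins=20, range=(-1.0, 1.0))
```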
Any help would be much appreciated!