Given an $m\times n$ matrix, how would you convert it to polar coordinates?
My understanding is that each element $i,j$ in the matrix relates to a Cartesian point $(x,y)$. The data at $i,j$ is the data to be plotted at $(x,y)$ when the matrix is plotted.
I would like to convert to polar coordinates so that each element $i',j'$ in the resulting matrix relates to a polar point $(\rho, \theta)$, where $\rho$ is the radial distance from the origin and $\theta$ is the angle from the positive $x$-axis. The data at $i',j'$ would be the data to be plotted at $(\rho, \theta)$ in the properly converted matrix.
Converting the coordinates themselves is easy using the standard formulas: $$\rho = \sqrt{x^2 + y^2}, \qquad \theta = \arctan\frac{y}{x}$$ (in practice, the two-argument $\operatorname{atan2}(y, x)$, so the angle lands in the correct quadrant).
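For reference, here is the conversion step as I would write it in Python/NumPy, assuming (and this is just my placeholder convention) that element $[0,0]$ is the origin and rows/columns correspond to $y$/$x$:

```python
import numpy as np

# Hypothetical m x n data array; the values themselves don't matter here.
data = np.arange(12.0).reshape(3, 4)

# (x, y) coordinate of every element, with the origin at element [0, 0]
# (shift these if your origin sits elsewhere in the matrix).
y, x = np.indices(data.shape)

rho = np.hypot(x, y)      # sqrt(x**2 + y**2)
theta = np.arctan2(y, x)  # full-quadrant arctan(y / x), in radians
```

This gives me a $\rho$ and $\theta$ for every element, but the resulting polar points do not form a regular grid, which is where I get stuck.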
How do I correlate the right data from the matrix to these points?
Can I flatten the matrix, interpolate the data, and then match the interpolated values with the appropriate $\rho$ and $\theta$?
How do I determine what to correlate with a given $\rho$ and $\theta$?
I plan to implement this in Python, so any specific advice is appreciated, but at the moment my understanding of the logic is incomplete, and that is my issue. If anything is incorrect or more info is needed, please let me know and I'll fix my post. Thank you!
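To make the question concrete, this is a rough sketch of the kind of resampling I have in mind: interpolate the Cartesian data, choose a regular $(\rho, \theta)$ grid, map each polar sample back to $(x, y)$, and evaluate the interpolant there. The grid sizes, origin choice, and angular range below are arbitrary placeholders, and I don't know whether this is the right approach:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical Cartesian data on an m x n grid, origin at element [0, 0].
m, n = 50, 60
data = np.random.rand(m, n)

# Interpolator over the Cartesian grid; axes are (y, x) = (row, column).
interp = RegularGridInterpolator(
    (np.arange(m), np.arange(n)), data, bounds_error=False, fill_value=np.nan
)

# Target polar grid: rho along rows, theta along columns (sizes are arbitrary).
rho = np.linspace(0, min(m, n) - 1, 40)
theta = np.linspace(0, np.pi / 2, 90)  # first quadrant only, for this origin
rho_g, theta_g = np.meshgrid(rho, theta, indexing="ij")

# Map each polar sample point back to Cartesian (x, y) and interpolate there.
x = rho_g * np.cos(theta_g)
y = rho_g * np.sin(theta_g)
polar_data = interp(np.stack([y, x], axis=-1))  # shape (len(rho), len(theta))
```

Is this the standard way to do it, or is there a better-suited tool for resampling onto a polar grid?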