Variance of amplitude and phase from sin and cos regressors in polar coordinates


On a data set, I estimated the sine and cosine weights at a specific frequency, $\beta_{\sin}$ and $\beta_{\cos}$. I can extract the amplitude and phase from these estimates as follows:

$$\textrm{Amplitude} = \sqrt{\beta_{\sin}^2 + \beta_{\cos}^2}$$

$$\textrm{Phase} = \arctan\left(\frac{\beta_{\sin}}{\beta_{\cos}}\right)$$

Associated with $\beta_{\sin}$ and $\beta_{\cos}$ are variances of the parameter estimates. How do I properly convert these to polar coordinates?

Thanks


$\newcommand{\var}{\operatorname{var}}\newcommand{\cov}{\operatorname{cov}}$For large sample sizes ("How large is 'large'?" is a question that could bear examination), the function $(\beta_{\cos},\beta_{\sin})\mapsto\arctan(\beta_{\sin}/\beta_{\cos})$ can be treated as locally linear, so the delta method applies. The partial derivatives are
$$
\frac{\partial}{\partial\beta_{\sin}} \arctan\frac{\beta_{\sin}}{\beta_{\cos}} = \frac{\beta_{\cos}}{\beta_{\cos}^2+\beta_{\sin}^2} \quad\text{ and }\quad \frac{\partial}{\partial\beta_{\cos}} \arctan\frac{\beta_{\sin}}{\beta_{\cos}} = \frac{-\beta_{\sin}}{\beta_{\cos}^2+\beta_{\sin}^2}.
$$
Letting $\hat\beta$ be the least-squares estimate of $\beta$, the approximate variance is a product of matrices, $(1\times2)(2\times2)(2\times 1)=\text{a scalar}$ (note the minus signs on the $\beta_{\sin}$ components, which come from the derivative with respect to $\beta_{\cos}$):
\begin{align}
& \var \arctan\frac{\hat\beta_{\sin}}{\hat\beta_{\cos}} \\[12pt]
\approx {} & \left(\frac{(\beta_{\cos},\,-\beta_{\sin})}{\beta_{\cos}^2+\beta_{\sin}^2}\right) \begin{pmatrix} \var\hat\beta_{\sin} & \cov(\hat\beta_{\sin},\hat\beta_{\cos}) \\ \cov(\hat\beta_{\cos},\hat\beta_{\sin}) & \var\hat\beta_{\cos} \end{pmatrix} \left(\frac{1}{\beta_{\cos}^2+\beta_{\sin}^2} \begin{pmatrix} \beta_{\cos} \\ -\beta_{\sin} \end{pmatrix} \right).
\end{align}
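The matrix product above is easy to compute numerically. Here is a minimal sketch in NumPy; the function name and the covariance numbers are hypothetical, and `cov` is the $2\times2$ covariance matrix of $(\hat\beta_{\sin},\hat\beta_{\cos})$ in that order:

```python
import numpy as np

def phase_variance(b_sin, b_cos, cov):
    """Delta-method variance of arctan(b_sin / b_cos).

    cov is the 2x2 covariance matrix of (b_sin, b_cos),
    ordered to match the gradient below.
    """
    r2 = b_cos**2 + b_sin**2
    # Gradient of arctan(b_sin / b_cos) with respect to (b_sin, b_cos).
    grad = np.array([b_cos, -b_sin]) / r2
    # Row vector times covariance matrix times column vector -> scalar.
    return grad @ cov @ grad

# Hypothetical numbers, uncorrelated estimates:
cov = np.diag([0.04, 0.09])  # var(b_sin) = 0.04, var(b_cos) = 0.09
print(phase_variance(1.0, 2.0, cov))
```

The same three-factor product works for any nonzero covariance; only the off-diagonal entries of `cov` change.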

A similar thing applies to the amplitude, but the derivatives are of course different.

PS: If $\displaystyle\sum_x \sin x\cos x=0$, where the sum is over all of the observed $x$-values in the data set, or in other words the sine and cosine regressors are orthogonal, then the covariances are $0$, provided one has the usual assumptions about errors being uncorrelated and homoskedastic. That makes the product of matrices equal to $$ \frac{\beta_{\cos}^2\var\hat\beta_{\sin} + \beta_{\sin}^2\var\hat\beta_{\cos}}{\left(\beta_{\cos}^2+\beta_{\sin}^2\right)^2}. $$
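As a quick sanity check, the zero-covariance closed form can be compared against the full matrix product; the numbers below are hypothetical:

```python
import numpy as np

# Zero-covariance case: closed-form expression vs. the full matrix product.
b_sin, b_cos = 1.0, 2.0
var_sin, var_cos = 0.04, 0.09  # hypothetical variances
r2 = b_cos**2 + b_sin**2

closed_form = (b_cos**2 * var_sin + b_sin**2 * var_cos) / r2**2

grad = np.array([b_cos, -b_sin]) / r2
matrix_form = grad @ np.diag([var_sin, var_cos]) @ grad

print(np.isclose(closed_form, matrix_form))  # prints True
```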

A similar thing can be done with the amplitude, but then of course the partial derivatives are $(\partial/\partial\beta_{\cos})\sqrt{\beta^2_{\cos} + \beta^2_{\sin}} = \beta_{\cos}\big/\sqrt{\beta^2_{\cos} + \beta^2_{\sin}}$ and $(\partial/\partial\beta_{\sin})\sqrt{\beta^2_{\cos} + \beta^2_{\sin}} = \beta_{\sin}\big/\sqrt{\beta^2_{\cos} + \beta^2_{\sin}}$. In the orthogonal, zero-covariance case this gives $$ \var\sqrt{\hat\beta^2_{\cos}+\hat\beta^2_{\sin}} \approx \frac{\beta_{\sin}^2\var\hat\beta_{\sin} + \beta_{\cos}^2\var\hat\beta_{\cos}}{\beta_{\cos}^2+\beta_{\sin}^2}. $$
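The amplitude case can be sketched the same way, swapping in the gradient of $\sqrt{\beta^2_{\sin}+\beta^2_{\cos}}$; again the function name and numbers are hypothetical:

```python
import numpy as np

def amplitude_variance(b_sin, b_cos, cov):
    """Delta-method variance of sqrt(b_sin**2 + b_cos**2).

    cov is the 2x2 covariance matrix of (b_sin, b_cos).
    """
    amp = np.hypot(b_sin, b_cos)  # sqrt(b_sin**2 + b_cos**2)
    # Gradient of the amplitude with respect to (b_sin, b_cos).
    grad = np.array([b_sin, b_cos]) / amp
    return grad @ cov @ grad

# Same hypothetical numbers as before, uncorrelated estimates:
cov = np.diag([0.04, 0.09])
print(amplitude_variance(1.0, 2.0, cov))
```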