I'm studying the book All of Statistics by Wasserman, and I'm trying to prove the theorem that follows the definition of the Wald test on page 153.
10.3 Definition. The Wald Test
Consider testing
$H_0:\theta = \theta_0\ \text{versus}\ H_1 : \theta \neq \theta_0$
If $H_0$ is true, assume that $\hat{\theta}$ is asymptotically Normal:
$\frac{\hat\theta-\theta_0}{\hat {se}}\xrightarrow{(d)} \mathcal N(0,1)$
The size $\alpha$ Wald test is: reject $H_0$ when $|W|>z_{\alpha/2}$ where
$W=\frac{\hat{\theta}-\theta_0}{\hat{se}}$
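As a sanity check on the definition, here is a minimal Python sketch of the size-$\alpha$ Wald test. The Bernoulli example and all names (`wald_test`, `p_hat`, etc.) are my own illustration, not from the book:

```python
import math
from statistics import NormalDist

def wald_test(theta_hat, se_hat, theta0, alpha=0.05):
    """Size-alpha Wald test: reject H0: theta = theta0 when |W| > z_{alpha/2}."""
    z = NormalDist().inv_cdf(1 - alpha / 2)  # z_{alpha/2}, upper alpha/2 Normal quantile
    W = (theta_hat - theta0) / se_hat
    return abs(W) > z, W

# Hypothetical example: Bernoulli(p), testing H0: p = 0.5 with n = 100 trials
# and 61 successes. The MLE is hat p = 0.61 with se = sqrt(hat p (1 - hat p) / n).
n, successes = 100, 61
p_hat = successes / n
se_hat = math.sqrt(p_hat * (1 - p_hat) / n)
reject, W = wald_test(p_hat, se_hat, 0.5)
print(reject, round(W, 3))  # W is about 2.255, which exceeds 1.96, so we reject
```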
Now he states this theorem, but without proving it:
10.6 Theorem.
Suppose the true value of $\theta$ is $\theta_* \neq \theta_0$ . The power $\beta(\theta_*)$ — the probability of correctly rejecting the null hypothesis — is given (approximately) by
$$1 - \Phi\bigg(\frac{\theta_0-\theta_*}{\hat{se}}+z_{\alpha/2}\bigg) + \Phi\bigg(\frac{\theta_0-\theta_*}{\hat{se}}-z_{\alpha/2}\bigg)$$
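Before proving the theorem, a Monte Carlo check can make it plausible. The sketch below assumes a concrete model of my choosing (not Wasserman's): $X_1,\dots,X_n \sim N(\theta, 1)$ with $\hat\theta$ the sample mean and $\hat{se} = 1/\sqrt{n}$; it compares the empirical rejection rate under $\theta_*$ with the theorem's formula:

```python
import math
import random
from statistics import NormalDist

Phi = NormalDist().cdf
z = NormalDist().inv_cdf(0.975)  # z_{alpha/2} for alpha = 0.05

# Assumed setup: X_1..X_n ~ N(theta, 1), hat theta = sample mean, se = 1/sqrt(n).
# Test H0: theta = 0 when the true value is theta_* = 0.3.
n, theta0, theta_star = 100, 0.0, 0.3
se = 1 / math.sqrt(n)

# Approximate power from Theorem 10.6.
power_formula = (1 - Phi((theta0 - theta_star) / se + z)
                 + Phi((theta0 - theta_star) / se - z))

# Monte Carlo estimate of the rejection probability under theta_*.
random.seed(0)
trials = 20000
rejections = 0
for _ in range(trials):
    theta_hat = sum(random.gauss(theta_star, 1) for _ in range(n)) / n
    if abs((theta_hat - theta0) / se) > z:
        rejections += 1
power_mc = rejections / trials
print(round(power_formula, 3), round(power_mc, 3))  # the two should be close
```

With these numbers the formula gives a power of roughly $0.85$, and the simulated rejection rate should land close to it.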
My Attempt to prove the theorem
Suppose the true value of $\theta$ is $\theta_*$ and I want to know
$$\beta(\theta_*)=\mathbb P_{\theta_*}\bigg(\frac{|\hat\theta-\theta_0|}{\hat{se}}>z_{\alpha/2}\bigg)=\mathbb P_{\theta_*}\bigg(\frac{\hat\theta-\theta_0}{\hat{se}}>z_{\alpha/2}\bigg)+\mathbb P_{\theta_*}\bigg(\frac{\hat\theta-\theta_0}{\hat{se}}<-z_{\alpha/2}\bigg)$$
How can I proceed from here? Am I right so far?
A definition you may need:
If $\mathcal F = \{ f(x; \theta) : \theta \in \Theta \}$ is a parametric model, we write $P_{\theta}(X\in A) = \int_A f(x; \theta)dx$. The subscript $\theta$ indicates that the probability is with respect to $f(x;\theta)$.