Suppose that you observe $(X_1,Y_1),\dots,(X_{100},Y_{100})$, which you assume to be i.i.d. copies of a random pair $(X,Y)$ taking values in $\mathbb{R}^2 \times \{1,2\}$.
The costs of misclassification are equal: $c_1=c_2=1$.
The distributions $X|Y=1$ and $X|Y=2$ are each rotationally symmetric.
I would like to perform LDA to classify the data points.
I have computed $a=\hat\Sigma^{-1}_p(\hat{\mu}_1-\hat{\mu}_2)=(0.132,\,-0.0732)^T$ and $\frac{1}{2}a^T(\hat{\mu}_1+\hat{\mu}_2)\approx 0$.
And now I would like to calculate the approximate expected cost of using LDA.
So, in the textbook I'm using, the expected cost of misclassification is defined as:
Suppose we use the classification rule $g:\mathbb{R}^p\rightarrow \{1,2\}$ that assigns $x$ to group $1$ when $x \in R_1$ and to group $2$ when $x\in R_2$. The expected cost of misclassification associated with the rule $g$ is $$\mathbb{E}[\text{cost}(Y,g(X))]=c_2\mathbb{P}(X\in R_1 \mid Y=2)\pi_2+c_1\mathbb{P}(X\in R_2 \mid Y=1)\pi_1$$ where $\pi_1=\mathbb{P}(Y=1)$ and $\pi_2=\mathbb{P}(Y=2)$ are the prior probabilities of the two groups.
And so my attempt is:
$$\mathbb{E}[\text{cost}(Y,g(X))]=c_2\mathbb{P}(X\in R_1 \mid Y=2)\pi_2+c_1\mathbb{P}(X\in R_2 \mid Y=1)\pi_1=\mathbb{P}(0.132X_1-0.0732X_2\gt 0\mid Y=2)\pi_2+\mathbb{P}(0.132X_1-0.0732X_2 \lt 0\mid Y=1)\pi_1$$
And I'm stuck here. I'm not sure how to continue. In particular, I don't know what $\mathbb{P}(0.132X_1-0.0732X_2 \lt 0\mid Y=1)$ and $\mathbb{P}(0.132X_1-0.0732X_2\gt 0\mid Y=2)$ are equal to, nor what $\pi_1,\pi_2$ are.

Estimating this quantity empirically is simple: $\Pr(y=c_i)=\pi_i$ is just the prior probability that a point belongs to class $c_i$. For your finite dataset, you can estimate it via
$$\pi_i \approx \frac{1}{N}\sum_{j=1}^N [y_j=c_i] = n_i/N$$
Secondly, $\Pr(a^Tx>0\mid y=c_i)$ is the probability that a point from class $c_i$ gets assigned to class $1$ (a correct classification when $i=1$, a misclassification when $i=2$), which you can estimate as
$$Pr(a^Tx>0\mid y=c_i)\approx\frac{1}{n_i}\sum_{j=1}^N [y_j = c_i]\cdot [a^Tx_j>0]$$
However, to get a reliable estimate, you should compute this on a held-out test set or via leave-one-out cross-validation.
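As a concrete sketch of the plug-in estimate above (the synthetic sample, the class means, and the variable names `X`, `y` are illustrative stand-ins for your own data; only the direction `a` is taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the observed sample (assumption: replace with your data).
n1, n2 = 50, 50
X = np.vstack([rng.normal([1.0, -0.5], 1.0, size=(n1, 2)),
               rng.normal([-1.0, 0.5], 1.0, size=(n2, 2))])
y = np.array([1] * n1 + [2] * n2)

a = np.array([0.132, -0.0732])  # LDA direction from the question
c1 = c2 = 1.0                   # equal misclassification costs

# Prior estimates: pi_i = n_i / N
pi1 = np.mean(y == 1)
pi2 = np.mean(y == 2)

# The rule assigns class 1 when a^T x > 0 (threshold ~ 0 as in the question).
scores = X @ a
p_wrong_1 = np.mean(scores[y == 1] <= 0)  # class-1 points landing in R_2
p_wrong_2 = np.mean(scores[y == 2] > 0)   # class-2 points landing in R_1

expected_cost = c1 * p_wrong_1 * pi1 + c2 * p_wrong_2 * pi2
print(expected_cost)
```

Computed on the training sample itself this is the resubstitution estimate, which is optimistically biased; the held-out or leave-one-out versions mentioned above only change which points enter the indicator sums.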
Now, if you want to compute this quantity analytically, I am not sure we have enough information about the distributions $p(x\mid y=c_i)$. However, we can deduce quite a bit from the two clues: the rotational symmetry of the class conditionals, and the fact that $\frac{1}{2}a^T(\hat\mu_1+\hat\mu_2)\approx 0$.
The first one implies that $p(x\mid y=c_i)=g_i(\|x-\mu_i\|_2)$ for some $g_i\colon\mathbb R_{\ge 0} \to\mathbb R_{\ge 0}$. Moreover, due to the rotational symmetries I think we must have $E[(x-\mu_i)(x-\mu_i)^T]=\Sigma_i = \alpha_i I$ for some $\alpha_i>0$, and consequently LDA will estimate the covariance as $\Sigma=\pi_1 \Sigma_1 + \pi_2\Sigma_2 = (\alpha_1\pi_1 + \alpha_2\pi_2)I$. Thus the second clue suggests that
$$\begin{aligned} &&\frac{1}{2}a^T(\hat\mu_1 +\hat\mu_2) &\approx 0 \\&\iff&(\hat\mu_1 -\hat\mu_2)^T\hat\Sigma^{-1}(\hat\mu_1 +\hat\mu_2) &\approx 0 \\&\;\;\leftrightsquigarrow&(\mu_1-\mu_2)^T\Sigma^{-1}(\mu_1+\mu_2) &=0 \\&\iff& \mu_1^T\Sigma^{-1}\mu_1 &= \mu_2^T\Sigma^{-1}\mu_2 \\&\iff& \frac{\mu_1^T\mu_1}{\alpha_1\pi_1 + \alpha_2\pi_2} &= \frac{\mu_2^T\mu_2}{\alpha_1\pi_1 + \alpha_2\pi_2} \\&\iff& \|\mu_1\|_2 &= \|\mu_2\|_2 \end{aligned}$$
In particular, after appropriate centering we have $\mu_1 = -\mu_2$. Now, if both $x\mid y=c_1$ and $x\mid y=c_2$ are normally distributed, we can integrate analytically and are more or less done. However, if they are not, I am not sure how to proceed.
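To spell out the Gaussian case: if $x\mid y=c_i \sim \mathcal N(\mu_i, \alpha_i I)$, then $a^Tx\mid y=c_i \sim \mathcal N(a^T\mu_i,\, \alpha_i\|a\|_2^2)$, so each conditional error probability is a single standard-normal CDF evaluation. A sketch, where the means, the $\alpha_i$, and the priors are illustrative placeholders (only the direction `a` comes from the question):

```python
import math
import numpy as np

def normal_cdf(z: float) -> float:
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative placeholders (assumptions, not values from the question):
mu1 = np.array([1.0, -0.5])
mu2 = -mu1                      # after centering, mu_2 = -mu_1
alpha1 = alpha2 = 1.0           # spherical covariances alpha_i * I
pi1 = pi2 = 0.5
c1 = c2 = 1.0

a = np.array([0.132, -0.0732])  # LDA direction from the question
norm_a = np.linalg.norm(a)

# a^T x | y=c_i ~ N(a^T mu_i, alpha_i ||a||^2); the rule picks class 1 iff a^T x > 0.
p_wrong_1 = normal_cdf(-(a @ mu1) / (math.sqrt(alpha1) * norm_a))  # P(a^T x < 0 | y=1)
p_wrong_2 = normal_cdf((a @ mu2) / (math.sqrt(alpha2) * norm_a))   # P(a^T x > 0 | y=2)

expected_cost = c1 * p_wrong_1 * pi1 + c2 * p_wrong_2 * pi2
print(expected_cost)
```

Note that with $\mu_1=-\mu_2$ and $\alpha_1=\alpha_2$ the two error probabilities coincide, so the expected cost equals the common error probability regardless of the priors.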