Suppose I have the Shannon sum-rate expression $$R = \sum_{i=1}^{N}\log\left(1+\frac{P\gamma_i}{\sigma^2}\right).$$ Here $N$ is the number of OFDMA subcarriers, and the $\gamma_i$ are channel power gains modeled as i.i.d. exponentially distributed random variables. $P$ and $\sigma^2$ are the transmit power and noise variance, respectively, both constant.
I need to find the outage probability $\Pr(R<t)$, where $t$ is a predefined rate threshold. Is there any way to compute this outage probability, either in closed form or approximately?
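For reference, here is what I currently use as a numerical baseline: a Monte Carlo estimate of $\Pr(R<t)$ obtained by drawing the $\gamma_i$ from a unit-mean exponential distribution (the function name, unit-mean assumption, and parameter values are just illustrative, not part of the problem statement):

```python
import numpy as np

def outage_probability(N, P, sigma2, t, trials=100_000, seed=None):
    """Monte Carlo estimate of Pr(R < t) for
    R = sum_i log(1 + P*gamma_i / sigma2), gamma_i ~ Exp(mean 1) i.i.d."""
    rng = np.random.default_rng(seed)
    # Draw all channel power gains at once: shape (trials, N)
    gamma = rng.exponential(scale=1.0, size=(trials, N))
    # Sum-rate per trial; log1p is numerically safer than log(1 + x)
    R = np.log1p(P * gamma / sigma2).sum(axis=1)
    # Fraction of trials where the rate falls below the threshold
    return np.mean(R < t)

# Example: 8 subcarriers at 0 dB SNR, threshold t = 5 nats
p_out = outage_probability(N=8, P=1.0, sigma2=1.0, t=5.0, seed=0)
```

This works, but it is slow to get accurate tail probabilities, which is why I am asking whether an analytical or semi-analytical solution exists.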