A question says: a channel is corrupted by Additive White Gaussian Noise with zero mean and RMS voltage 20 nV. The probability that the noise voltage is less than a particular positive value $c$ is 0.9. Find $c$.
I can calculate it using the following equation:
$$ 0.9 = \int_{-\infty}^{c} {\frac{1}{\sqrt{2\pi\sigma^2}}e^{-x^2/(2\sigma^2)} dx}$$
But I'm wondering: what would the value of the $\sigma^2$ term be? And I end up with infinity in both the denominator and the numerator.
Alternatively, I can use the standard normal density $\displaystyle {\frac{e^{-z^2/2}}{\sqrt{2\pi}}}$:
transforming $x$ to $z$ via $\displaystyle z = \frac{x}{\sigma}$ and using the normal tables gives $P(z < k) = 0.9$ where $\displaystyle k = \frac{c}{\sigma}$,
but I still end up with a $\sigma$.
Could I have a little guidance to point me in the right direction? If a resistance were given, we could find the value of $\sigma^2$, but none has been given.
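For what it's worth, if it turns out that $\sigma$ is simply the given 20 nV RMS value (which is exactly the point I'm unsure about), the numerical computation would be something like the following sketch using scipy's inverse normal CDF:

```python
from scipy.stats import norm

# Assumption (the very point in question): sigma equals the given 20 nV RMS value.
sigma = 20e-9          # volts
p = 0.9                # P(noise voltage < c)

k = norm.ppf(p)        # inverse standard-normal CDF, k = Phi^{-1}(0.9) ~ 1.2816
c = k * sigma          # undo the z = x / sigma transformation
print(f"c = {c * 1e9:.2f} nV")   # ~ 25.63 nV
```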
I decided to move from a comment to an answer because sometimes I end up using facts without digging too deeply into the whys. So, I decided to dig a little more and hopefully answer the OP's questions. This answer is mainly about why the RMS value of the noise is equal to its standard deviation. As a side note, I will also make some comments about the resistance value.
AWGN is a random process $X(t)$ with certain special properties, one of which is that it is ergodic. Here you will find a nice intuitive explanation of ergodicity that can help clarify why AWGN is an ergodic process. Roughly speaking, that property implies that the statistical (or ensemble) averages (those you compute using the expected value operator $E[\cdot]$) are the same as the temporal averages (those you compute by integrating over time and then dividing by the integration time). Below we will see specifically what this means.
The autocorrelation function of a random process (ensemble average)
The autocorrelation function of a random process $X(t)$ is defined as
$$R(\tau) = E[X(t)X(t+\tau)].$$
If $X(t)$ is zero-mean and we evaluate at $\tau = 0$, we find that
$$R(0) = \sigma^2 = E[X^2(t)]. \label{1}\tag{1}$$
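If it helps, here is a small numerical illustration (not part of the derivation) of this ensemble average: I draw many independent realizations of the noise at one fixed time instant and average $X^2$ across them; the value of $\sigma$ and the number of realizations below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 20e-9                 # illustrative value, matching the problem's 20 nV RMS
n_realizations = 200_000      # number of independent realizations of the process

# Sample X(t0) across the ensemble at one fixed time instant t0:
x_t0 = rng.normal(0.0, sigma, n_realizations)

ensemble_R0 = np.mean(x_t0 ** 2)        # estimate of E[X^2(t)] = R(0)
print(ensemble_R0, sigma ** 2)          # both ~ 4e-16 V^2
```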
The autocorrelation function of a signal (temporal average)
The autocorrelation function of a signal $X(t)$ is defined as
$$R(\tau) = \lim_{T\to\infty} \frac{1}{T} \int_{0}^{T} X(t)X(t+\tau)dt.$$
Denoting by $RMS(X)$ the RMS value of $X(t)$ and letting $\tau = 0$, we get
$$R(0) = RMS^2(X) = \lim_{T\to\infty} \frac{1}{T} \int_{0}^{T} X^2(t)\,dt. \label{2}\tag{2}$$
For an ergodic random process $X(t)$, the ensemble average in \eqref{1} is equal to the temporal average in \eqref{2}! That is,
$$\sigma^2 = E[X^2(t)] = \lim_{T\to\infty} \frac{1}{T} \int_{0}^{T} X^2(t)dt = RMS^2(X)$$
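And here is the temporal counterpart, a quick sanity check of the ergodicity claim (again, just an illustration with made-up parameters): I take one long sampled record of the noise and check that its time average of $X^2$, i.e. its squared RMS, agrees with $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 20e-9                    # same illustrative 20 nV
n_samples = 1_000_000            # one long record, playing the role of T -> infinity

x = rng.normal(0.0, sigma, n_samples)   # a single sampled realization of the noise

temporal_R0 = np.mean(x ** 2)    # discrete stand-in for (1/T) * integral of X^2(t) dt
rms = np.sqrt(temporal_R0)       # the RMS value of the record

print(temporal_R0, rms)          # temporal_R0 ~ sigma^2 ~ 4e-16, rms ~ 20e-9 = sigma
```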
What is the physical meaning of all this? Since $X(t)$ is being viewed as a voltage signal, its RMS value is $RMS(X)$ as defined above. Although we may usually interpret $\sigma^2$ as a normalized average power dissipated in a 1 Ohm resistance, strictly speaking it is not a power. Take a look at the units: if $X(t)$ is in Volts, then both $\sigma^2$ and $RMS^2(X)$ have units of [Volts]$^2$, not Watts. Thus, the only way in which it would make sense for the problem to include a resistance (I would prefer to talk about impedance) value $R$ is if, instead of the RMS value, we were given the average power $P_{avg}$, because from those two values you can compute the RMS, that is
$$RMS^2 = P_{avg}R.$$
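So, for instance, if the problem had specified an average power and an impedance instead of the 20 nV RMS value, you could recover the RMS (and hence $\sigma$) like this; the $P_{avg}$ and $R$ values below are made up purely for illustration.

```python
import numpy as np

# Made-up numbers, purely to illustrate RMS^2 = P_avg * R:
P_avg = 8e-18      # hypothetical average noise power in Watts
R = 50.0           # hypothetical impedance in Ohms

rms = np.sqrt(P_avg * R)   # RMS voltage in Volts
sigma = rms                # for this ergodic zero-mean process, sigma = RMS
print(f"RMS = sigma = {rms * 1e9:.2f} nV")   # sqrt(8e-18 * 50) = 20 nV
```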