I read the following on the AP statistics website.
There’s a nice geometric model that represents random variables as vectors whose lengths correspond to their standard deviations. When the variables are independent, the vectors are orthogonal. Then the standard deviation of the sum or difference of the variables is the hypotenuse of a right triangle.
It doesn't elaborate further; it just points this out as a neat fact. How can we represent a random variable as a vector?
I assume this is not referring to a multivariate random variable (i.e. a vector of random variables), but to a single random variable. I assume this because the length of a vector of random variables does not (to my knowledge) represent a standard deviation.
The AP folks might be thinking along these lines:
Real-valued random variables are real-valued $\mathcal S$-measurable functions on a probability space $(\Omega,\mathcal S, P)$. Those with finite variance are elements of $L^2(\Omega)$, the space of square-integrable functions on $\Omega$ (integrable with respect to the measure $P$). This is an inner product space, with inner product $\langle X,Y\rangle = E[XY] = \int_\Omega X(\omega)Y(\omega)\,P(d\omega)$. Consider the mean-centering map $c: X\mapsto X-E[X]$; it is the orthogonal projection onto the subspace of square-integrable mean-$0$ random variables. Then $\|c(X)\|^2=\langle c(X),c(X)\rangle$ equals the variance of $X$, so the length $\|c(X)\|$ is the standard deviation of $X$. More generally, $\langle c(X),c(Y)\rangle = \operatorname{Cov}(X,Y)$, so if $X$ and $Y$ are uncorrelated (in particular, if they are independent) then $c(X)$ and $c(Y)$ are perpendicular.
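This is exactly the quoted right-triangle picture: expanding the squared length of $c(X) \pm c(Y)$ with the inner product gives

```latex
\begin{align*}
\operatorname{Var}(X \pm Y)
  &= \|c(X) \pm c(Y)\|^2 \\
  &= \|c(X)\|^2 \pm 2\,\langle c(X), c(Y)\rangle + \|c(Y)\|^2 \\
  &= \operatorname{Var}(X) + \operatorname{Var}(Y) \pm 2\operatorname{Cov}(X, Y),
\end{align*}
```

and when $X$ and $Y$ are uncorrelated the cross term vanishes, leaving $\operatorname{SD}(X \pm Y) = \sqrt{\operatorname{SD}(X)^2 + \operatorname{SD}(Y)^2}$: the hypotenuse of a right triangle whose legs have lengths $\operatorname{SD}(X)$ and $\operatorname{SD}(Y)$.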
So maybe $L^2(\Omega)$ is the vector space in question, and $c(X)$ is the vector corresponding to $X$.
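One can check this picture numerically. The sketch below (a minimal illustration; the distributions and sample size are arbitrary choices) treats the centered samples as finite-dimensional stand-ins for $c(X)$ and $c(Y)$: their inner product divided by $n$ estimates the covariance, and the variances obey the Pythagorean relation up to sampling error.

```python
import random
import statistics

random.seed(0)
n = 200_000

# Two independent random variables (illustrative choice of distributions).
x = [random.gauss(3.0, 2.0) for _ in range(n)]          # SD ≈ 2
y = [random.expovariate(1.0 / 1.5) for _ in range(n)]   # SD ≈ 1.5

# Centered samples play the role of the vectors c(X), c(Y);
# their inner product divided by n estimates Cov(X, Y).
mx, my = statistics.fmean(x), statistics.fmean(y)
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
print(cov)  # near 0: the "vectors" are nearly orthogonal

# Pythagorean check: Var(X + Y) ≈ Var(X) + Var(Y)
s = [a + b for a, b in zip(x, y)]
print(statistics.pvariance(s), statistics.pvariance(x) + statistics.pvariance(y))
```

Since the empirical identity $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y)$ holds exactly for sample variances, the two printed variance figures differ by exactly twice the (small) empirical covariance.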