I have two series of numbers with a certain correlation coefficient $\rho$.
How can I generate two series of random numbers that have correlation $\rho$, $\mu = 0$ and $\sigma = 1$?
I tried using a Cholesky decomposition, but the output numbers are scaled like the original series, not to $\sigma = 1$.
Code:
import numpy as np

# two series with small means and standard deviations
a = np.random.normal(0.0003, 0.0001, size=100)
b = np.random.normal(0.005, 0.01, size=100)

# Cholesky factor of the sample covariance matrix
cov = np.cov(a, b)
chol = np.linalg.cholesky(cov)

# mix a single draw of standard normals with the Cholesky factor
z = np.random.normal(0, 1, size=(2, 10))
out = np.matmul(chol, z)
print(out[0].std())
print(out[1].std())
On one run that code printed 9.165613274294529e-05 and 0.012372828396076391 — roughly the standard deviations of a and b, not 1.
Shouldn't the output of the matmul operation have the same $\sigma$ as the standard normals passed as the second argument (np.random.normal(0, 1, size=(2, 10)))?
For this purpose, the Cholesky decomposition can be applied to either the covariance or the correlation matrix of the original numbers. Factoring the covariance matrix reproduces the original variances — that is exactly why your output is scaled like a and b. To get $\sigma = 1$, factor the correlation matrix instead (its diagonal is all ones, so the transformed series keep unit variance):

chol_mat = np.linalg.cholesky(np.corrcoef(a, b))
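Putting it together, here is a minimal sketch of the whole procedure. The target correlation rho = 0.7 and the sample size are example values, not from the question; any valid correlation matrix works the same way:

```python
import numpy as np

rng = np.random.default_rng(0)

rho = 0.7       # target correlation (example value)
n = 100_000     # large sample so the estimates are tight

# Cholesky factor of the 2x2 correlation matrix [[1, rho], [rho, 1]]
corr = np.array([[1.0, rho], [rho, 1.0]])
chol = np.linalg.cholesky(corr)

# Start from independent standard normals, then mix them.
z = rng.standard_normal(size=(2, n))
x = chol @ z

print(x.mean(axis=1))        # each row: mean ≈ 0
print(x.std(axis=1))         # each row: std ≈ 1
print(np.corrcoef(x)[0, 1])  # ≈ rho
```

Because the correlation matrix has ones on its diagonal, each row of x keeps unit variance, while the off-diagonal entry of the Cholesky factor injects the desired correlation.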