Say I have $n$ time series $y_1, y_2, \dots, y_n$, each $y_i$ a vector of size $m$.
Given $y_i$ and $y_j$, the correlation $\rho(y_i, y_j)$ can be computed.
Now suppose I want to scale the correlation up or down, that is, change $\rho(y_i, y_j)$ to $\rho(y_i, y_j)$ plus 10% of itself, i.e. $1.1\,\rho(y_i, y_j)$ (change the current level of correlation to 10 percent more than what it is).
My question: is there a way to achieve this by scaling the time series $y_i$ and $y_j$? That is, is there a transform $f: y_i \mapsto z_i$ for all $i$ such that
$$\rho(z_i, z_j) = 1.1\,\rho(y_i, y_j)?$$
I think this could be done by approximating the functional relationship between the dependent and independent variables (e.g., via a Taylor series expansion) and then transforming the data accordingly. In general, linear correlation alone does not work well here, so nonlinear regression may be worth trying (the suggestion above is based on this).
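One point worth noting: Pearson correlation is invariant under affine rescaling of each series separately, so a pure per-series scaling $f(y_i) = a\,y_i + b$ cannot change $\rho$ at all. A transform that *can* hit a target correlation is linear mixing, e.g. $z_i = y_i + \lambda\, y_j$ for a suitable $\lambda$. Below is a minimal sketch (hypothetical synthetic data, target $= 1.1\,\rho$, assuming the target stays inside $(-1, 1)$) that solves for $\lambda$ in closed form after standardizing both series:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 10_000

# Hypothetical example data: two correlated series.
y_j = rng.standard_normal(m)
y_i = 0.5 * y_j + rng.standard_normal(m)

def standardize(v):
    """Center to mean 0 and scale to (population) std 1."""
    return (v - v.mean()) / v.std()

yi, yj = standardize(y_i), standardize(y_j)

rho = float(np.corrcoef(yi, yj)[0, 1])
target = 1.1 * rho
assert abs(target) < 1  # a correlation must stay in (-1, 1)

# For standardized inputs and z_i = yi + lam * yj:
#   corr(z_i, yj) = (rho + lam) / sqrt(1 + 2*lam*rho + lam**2)
# Setting corr(z_i, yj) = target and solving the resulting
# quadratic for lam gives the closed form below.
lam = target * np.sqrt((1 - rho**2) / (1 - target**2)) - rho
z_i = yi + lam * yj

new_rho = float(np.corrcoef(z_i, yj)[0, 1])
print(rho, target, new_rho)
```

The trade-off is that $z_i$ is no longer just a rescaled version of $y_i$; it borrows variation from $y_j$, which may or may not be acceptable depending on what the transformed series is used for.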