Let's consider two random variables $X, Y$ such that $EX = EY = 0$ and $\mathrm{Var}\,X = \mathrm{Var}\,Y = 1$.
Moreover, when fitting the OLS model without an intercept, $Y \sim X$, we obtain the parameter $\beta$.
What I want to investigate is the connection between the parameter $\beta$ and the parameter $\hat\beta$ obtained from the regression $X \sim Y$.
My solution
If $\beta$ and $\hat \beta$ are solutions of OLS problems then we have that:
$$\beta = (X^TX)^{-1}X^TY$$
$$\hat \beta = (Y^TY)^{-1}Y^TX$$
Then we have that:
$$\beta \hat \beta = (X^TX)^{-1}X^TY(Y^TY)^{-1}Y^TX = X^{-1}X^TX^TYY^{-1}Y^TX = 1$$
So we have that $\hat \beta = \frac{1}{\beta}$.
I'm not sure whether my solution is correct, since I checked it in R and obtained:
y <- rnorm(1000000)
k <- rnorm(1000000)
lm(y~ 0 + k)$coefficients == 1 / lm(k~0 + y)$coefficients
FALSE
Could you please tell me where my error is?
EDIT
As I understand it, the problem with my calculation is that I'm trying to invert vectors, which are not invertible. So another idea, prompted by @GoldenRatio, is the following: we know that
$$\beta = \frac{\textrm{cov}(X, Y)}{\textrm{Var}X} = E[XY]$$
as well as
$$\hat \beta = \frac{\textrm{cov}(X, Y)}{\textrm{Var}Y} = E[XY]$$
From this we would have that $\beta = \hat \beta$. Is that right?
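As a numerical sanity check (a sketch in R; the data-generating process with slope 0.5 is an assumption for illustration): if the *sample* itself is standardized, so that the sample variances, not just the population ones, equal 1, the two fitted slopes agree up to floating point, since they then share the same denominator.

```r
# Standardize the sample so the sample variances are exactly 1
set.seed(42)
x <- rnorm(1000)
y <- 0.5 * x + rnorm(1000)          # assumed data-generating process
xs <- as.numeric(scale(x))          # sample mean 0, sample sd 1
ys <- as.numeric(scale(y))

b     <- unname(coef(lm(ys ~ 0 + xs)))   # slope of ys on xs
b_hat <- unname(coef(lm(xs ~ 0 + ys)))   # slope of xs on ys

all.equal(b, b_hat)                 # equal up to floating point
```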
Your calculation doesn't make sense for a few reasons, one of which is that an $n\times 1$ vector is not invertible for $n>1$. Assuming for simplicity that each regression uses a single regressor and no intercept (so that what you are calling $\beta,\hat\beta$ are scalars), we cannot say how the objects
$$\frac{\frac{1}{n}\sum_{i}X_iY_i}{\frac{1}{n}\sum_i X_i^2} \quad\text{and}\quad \frac{\frac{1}{n}\sum_{i}X_iY_i}{\frac{1}{n}\sum_i Y_i^2}$$
compare, since they are defined at the sample level: the numerators are identical, but the denominators are sample variances that need not be equal.
However, assuming the $(X_i,Y_i)$ are iid with $\text{Var}(X_i)=\text{Var}(Y_i)=1$ and $E[X_i]=E[Y_i]=0$, the WLLN tells us the two objects above have the same limit in probability, namely $$\frac{E[X_iY_i]}{E[X_i^2]}=\frac{E[X_iY_i]}{E[Y_i^2]}=\text{Cov}(X_i,Y_i).$$
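To illustrate the asymptotic statement, here is a simulation sketch in R (`rho` is an assumed population correlation; with this construction $EX = EY = 0$ and $\mathrm{Var}\,X = \mathrm{Var}\,Y = 1$, so both slopes should approach $\text{Cov}(X_i,Y_i)=\rho$ as $n$ grows):

```r
set.seed(1)
n   <- 1e6
rho <- 0.6                                   # assumed population correlation
x <- rnorm(n)
y <- rho * x + sqrt(1 - rho^2) * rnorm(n)    # EX = EY = 0, Var X = Var Y = 1

beta     <- unname(coef(lm(y ~ 0 + x)))      # regress y on x
beta_hat <- unname(coef(lm(x ~ 0 + y)))      # regress x on y

# Both estimates are close to rho; they are NOT reciprocals of one another
c(beta = beta, beta_hat = beta_hat, rho = rho)
```

Note that $\beta\hat\beta$ converges to $\rho^2 \le 1$, not to $1$, which is another way to see that the reciprocal relationship in the original derivation fails.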