I understand the intuition that it is difficult to distinguish the effects of two independent variables on the dependent variable when they are highly correlated, but I don't see how this intuition works its way through the formula for the variance of the regression coefficients:
$$\operatorname{Var}(\hat\beta) = \sigma^2(X'X)^{-1}$$
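For concreteness, in the two-predictor case with centered regressors and no intercept (my own notation), I believe the cofactor expansion gives:

$$X'X=\begin{pmatrix}S_{11}&S_{12}\\S_{12}&S_{22}\end{pmatrix},\qquad(X'X)^{-1}=\frac{1}{S_{11}S_{22}-S_{12}^{2}}\begin{pmatrix}S_{22}&-S_{12}\\-S_{12}&S_{11}\end{pmatrix},$$

where $S_{jk}=\sum_i x_{ij}x_{ik}$, so the variance of the first coefficient would be $\sigma^2 S_{22}/(S_{11}S_{22}-S_{12}^{2})$.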
I found the image below from here, which shows that the variances of the coefficients, on the diagonal, are functions of their corresponding variables and of the inverse of the determinant. I tried to do something similar with two independent variables (a 3x3 matrix, including the intercept) but couldn't really get anywhere that showed the relationship between collinearity and the standard error.
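Here is a quick numerical sketch of what I was trying (the setup and names are my own), computing the diagonal of $(X'X)^{-1}$ as the correlation between the two predictors increases:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

for rho in [0.0, 0.9, 0.99]:
    # draw two predictors with correlation rho
    cov = np.array([[1.0, rho], [rho, 1.0]])
    x1, x2 = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    X = np.column_stack([np.ones(n), x1, x2])  # intercept + 2 predictors

    # diagonal entries scale the coefficient variances: Var(b_j) = sigma^2 * diag_j
    XtX_inv = np.linalg.inv(X.T @ X)
    print(rho, np.diag(XtX_inv))
```

Running this, the diagonal entries for the two slope coefficients grow as rho approaches 1, which seems to confirm the variance-inflation story, but I still can't see it directly from the formula.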
Additionally, given that $(X'X)^{-1}$ also appears in the formula for the OLS coefficients, and the determinant decreases with collinearity, wouldn't scaling by the inverse of the determinant mean that collinearity increases the size of the coefficients themselves? That seems unintuitive.
