The reason the law of large numbers fails to apply to the Cauchy distribution is that the distribution of $X_1+X_2+X_3+\dots+X_n$ is the same as that of $X_1+X_1+\dots+X_1 = n X_1$. Feller bills this as a curious property on page 51 of *An Introduction to Probability Theory and Its Applications*, Volume 2 (first edition). In the first case we're adding $n$ independent random variables; in the second we're adding $n$ perfectly dependent random variables, as dependent as they can possibly be (they're all literally the same variable). And yet we end up with the same resulting distribution.
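A quick numerical sanity check of this property (a sketch using NumPy, not part of Feller's argument): the quartiles of a standard Cauchy are $-1, 0, 1$, so the empirical quartiles of $(X_1+\dots+X_n)/n$ should land near those values if the normalized sum is again standard Cauchy.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 200_000

# Sum of n independent standard Cauchys, divided by n,
# should again be distributed as a single standard Cauchy.
samples = rng.standard_cauchy((reps, n)).sum(axis=1) / n

q25, q50, q75 = np.quantile(samples, [0.25, 0.5, 0.75])
# Standard Cauchy quartiles are -1, 0, 1, so these should be close.
print(round(q25, 2), round(q50, 2), round(q75, 2))
```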
EDIT: The objection regarding the variances being different is resolved: the variances blow up (the Cauchy distribution has no finite variance, so the variance comparison below doesn't apply). The question that remains is the one in the title. We know that adding $n$ perfectly dependent Cauchys is equivalent to scaling the first Cauchy by $n$, and that adding $n$ completely independent Cauchys also yields the first Cauchy scaled by $n$. What happens if we add $n$ Cauchys that are dependent, but not perfectly? Do we still get the first Cauchy scaled by $n$?
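A caveat before simulating: "correlation" isn't even defined for Cauchy variables, since second moments don't exist, so "partially dependent" has to be made concrete some other way. One hypothetical stand-in (my construction, not from the question) is a common-shock mixture $X_i = \rho Z_0 + (1-\rho) Z_i$ with $Z_0, Z_1, \dots, Z_n$ i.i.d. standard Cauchy and $0 < \rho < 1$. Because scale parameters of independent Cauchys add, each $X_i$ is standard Cauchy, and the sum has scale $n\rho + n(1-\rho) = n$, so under this particular construction the answer is yes. A sketch checking this empirically:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10, 200_000
rho = 0.5  # hypothetical mixing weight on the shared shock

z0 = rng.standard_cauchy(reps)        # shared component Z_0
z = rng.standard_cauchy((reps, n))    # idiosyncratic components Z_1..Z_n

# X_i = rho*Z_0 + (1-rho)*Z_i is standard Cauchy (scales add: rho + (1-rho) = 1),
# and the X_i are dependent through the shared Z_0.
x = rho * z0[:, None] + (1 - rho) * z

# The sum has Cauchy scale n*rho + n*(1-rho) = n, so (sum / n) should be standard Cauchy.
s = x.sum(axis=1) / n

q25, q50, q75 = np.quantile(s, [0.25, 0.5, 0.75])
# Standard Cauchy quartiles are -1, 0, 1.
print(round(q25, 2), round(q50, 2), round(q75, 2))
```

This doesn't settle the general question for every possible dependence structure, but it shows at least one family of partially dependent Cauchys for which the sum is still $n$ times a single Cauchy.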
The variance argument (since resolved)
But thinking about the variances of the two sums, this doesn't seem to add up. When the variables are independent, we have:
$$Var(X_1+X_2+\dots X_n) = n Var(X_1) \tag{1}$$
And for the second sum we have:
$$Var(X_1+X_1+ \dots X_1) = Var(n X_1) = n^2 Var(X_1) \tag{2}$$
It's clear the second sum has a higher variance, due to the covariance terms that are absent in the first case. But if the two sums have different variances, how can they have the same distribution? And if they somehow do have the same variance, what happens when the dependence between the random variables is somewhere between complete independence and the variables all being copies of the same one?
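One way to see why the variance comparison in (1) and (2) breaks down: the Cauchy distribution has no finite variance, so the sample variance never stabilizes as the sample grows. A small simulation sketch (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_cauchy(1_000_000)

# Running sample variance over growing prefixes: it keeps jumping around
# (driven by occasional huge outliers) instead of converging,
# because the Cauchy distribution has no finite variance.
for k in [10**3, 10**4, 10**5, 10**6]:
    print(k, round(float(np.var(x[:k])), 1))
```

Since both sides of (1) and (2) are infinite, neither equation says anything about the two sums, and the apparent contradiction with their having the same distribution disappears.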