Show that if $u$ is harmonic in $\mathbb{R}^n$ and if $T:\mathbb{R}^n\to\mathbb{R}^n$ is an orthogonal transformation, then $u\circ T$ is also harmonic in $\mathbb{R}^n$. Recall that a linear transformation $T:\mathbb{R}^n\to\mathbb{R}^n$ is orthogonal if $(^tT)T = T(^tT) = I$, the identity.
What is $(^tT)T$? It's the transpose of $T$ multiplied by $T$ (here ${}^tT$ denotes the transpose of $T$).
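As a quick numerical sanity check of the orthogonality condition, here is a small sketch with a 2D rotation as the concrete $T$ (the rotation and the angle are my own choices, not part of the problem):

```python
import math

# A concrete orthogonal T: rotation by 30 degrees in R^2 (my own example).
theta = math.pi / 6
T = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matmul(A, B):
    """Multiply two n x n matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# (^tT)T should be the identity matrix.
Tt = [[T[j][i] for j in range(2)] for i in range(2)]
product = matmul(Tt, T)
for i in range(2):
    for j in range(2):
        expected = 1.0 if i == j else 0.0
        assert abs(product[i][j] - expected) < 1e-12
```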
Well, I'm looking into $p(x_1,\cdots,x_n) = (u\circ T)(x_1,\cdots,x_n)$
Here's what I did:
$$\frac{\partial p}{\partial x_j}(x) = \sum_{i=1}^n\frac{\partial u}{\partial u_i}(T(x))\frac{\partial u_i}{\partial x_j}(x)$$
where $u_i = T_i$ because $T = (T_1,\cdots,T_n)$ (note that $u$ itself is scalar-valued). Is there a better way to represent this?
Then we take the derivative again, because we need the Laplace operator:
$$\frac{\partial^2 p}{\partial x_j^2}(x) = \sum_{i=1}^n\left(\sum_{k=1}^n\frac{\partial^2 u}{\partial u_k\partial u_i}(T(x))\frac{\partial u_i}{\partial x_j}(x) + \frac{\partial u}{\partial u_i}(T(x))\frac{\partial^2 u_i}{\partial x_j^2}(x)\right)$$
I need to sum this over $j=1,\cdots,n$ and prove it's $0$. I can also use $$\sum_{l=1}^n\frac{\partial^2 u}{\partial x_l^2} = 0$$
because $u$ is harmonic by hypothesis. When I sum over $j$ I can use the fact that each $u_i = T_i$ is linear in $x$, so $\frac{\partial^2 u_i}{\partial x_j^2}(x) = 0$ for every $j$, and therefore
$$\sum_{j=1}^n\sum_{i=1}^n \frac{\partial u}{\partial u_i}(T(x))\frac{\partial^2 u_i}{\partial x_j^2}(x) = 0$$
so I end up with
$$\Delta p = \sum_{j=1}^n\frac{\partial^2 p}{\partial x_j^2}(x) = \sum_{j=1}^n\sum_{i=1}^n\left(\sum_{k=1}^n\frac{\partial^2 u}{\partial u_k\partial u_i}(T(x))\frac{\partial u_i}{\partial x_j}(x)\right)$$
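As a sanity check that dropping the second-order term is legitimate: each component of a linear map has identically vanishing second partials. A finite-difference sketch confirms this (the rotation below is my own example, not part of the problem):

```python
import math

# Hypothetical concrete linear T: rotation by 30 degrees in R^2 (my own example).
theta = math.pi / 6
t = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def u_i(i, x):
    """i-th component of T(x) for the linear map above."""
    return sum(t[i][j] * x[j] for j in range(2))

def second_partial(i, j, x, h=1e-3):
    """Central second difference of u_i in the x_j direction at x."""
    xp = list(x); xp[j] += h
    xm = list(x); xm[j] -= h
    return (u_i(i, xp) - 2 * u_i(i, x) + u_i(i, xm)) / h**2

# Second partials of a linear map vanish (up to floating-point noise).
x0 = [0.7, -1.3]
for i in range(2):
    for j in range(2):
        assert abs(second_partial(i, j, x0)) < 1e-6
```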
I guess now I have to use the fact that the transformation is orthogonal. I think that $(^tT)T$ should play a role here, but I can't even see the entries of $T$ in the sum I've ended up with. Maybe orthogonality isn't what's needed here, and that's why I have no idea what to do now.
UPDATE:
If I remember that $u_i = T_i$ and write $T$ as a matrix with entries $t_{ij}$, we have $\frac{\partial u_i}{\partial x_j} = \frac{\partial T_i}{\partial x_j} = t_{ij}$, so we end up with:
$$\Delta p = \sum_{j=1}^n\frac{\partial^2 p}{\partial x_j^2}(x) = \sum_{j=1}^n\sum_{i=1}^n\left(\sum_{k=1}^n\frac{\partial^2 u}{\partial u_k\partial u_i}(T(x))t_{ij}\right)$$
In the PDF the user linked below, the derivative is taken with respect to the $i$th variable and then with respect to the $j$th. However, I should differentiate twice with respect to the same variable, not take mixed partials, because the Laplacian does not involve mixed partials.
UPDATE 2:
Turns out I should have considered $T$ as being linear from the beginning; then I wouldn't have needed the product rule. I also forgot the outer derivative when I took the second partial derivative, so it should be:
$$\frac{\partial^2 p}{\partial x_j^2}(x) = \sum_{i=1}^n\sum_{k=1}^n\frac{\partial^2 u}{\partial u_k\partial u_i}(T(x))\frac{\partial u_k}{\partial x_j}(x)\frac{\partial u_i}{\partial x_j}(x) = \sum_{i=1}^n\sum_{k=1}^n\frac{\partial^2 u}{\partial u_k\partial u_i}(T(x))\,t_{kj}\,t_{ij}$$
Note that the two first-order factors carry different indices, $t_{kj}$ and $t_{ij}$, not $t_{ij}t_{ij}$: one comes from the inner chain rule and one from the outer. Now summing over $j$ and using orthogonality, $\sum_{j=1}^n t_{ij}t_{kj} = (T(^tT))_{ik} = \delta_{ik}$, so
$$\Delta p(x) = \sum_{i=1}^n\frac{\partial^2 u}{\partial u_i^2}(T(x)) = (\Delta u)(T(x)) = 0.$$
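Independently of the algebra, a finite-difference sanity check supports the statement itself: taking $u(x,y) = x^2 - y^2$ (harmonic) and a rotation $T$ (both my own choices), the Laplacian of $u\circ T$ vanishes at sample points:

```python
import math

# u(x, y) = x^2 - y^2 is harmonic in R^2 (my own test function).
def u(x, y):
    return x**2 - y**2

theta = 0.83  # arbitrary rotation angle (my own choice)
def T(x, y):
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

def p(x, y):
    """The composition u o T."""
    return u(*T(x, y))

def laplacian(f, x, y, h=1e-4):
    """5-point finite-difference Laplacian of f at (x, y)."""
    return (f(x + h, y) + f(x - h, y) + f(x, y + h) + f(x, y - h)
            - 4 * f(x, y)) / h**2

# The composition should be harmonic (up to finite-difference error).
for (x0, y0) in [(0.3, -0.4), (1.2, 2.5), (-0.7, 0.1)]:
    assert abs(laplacian(p, x0, y0)) < 1e-4
```

Since $p$ is a quadratic polynomial here, the 5-point stencil is exact up to rounding, so the residuals are tiny.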