Let $X:=[x_1\dots x_n], Y:=[y_1\dots y_n] \in \mathbb{R}^{d\times n}, x_i, y_i \in \mathbb{R}^{d\times 1}$.
Assume $X$ and $Y$ are centered, i.e. each of their $d$ rows sums to zero; equivalently, $X(I_n - \frac{1}{n}1_n1_n') = X$, where $1_n$ denotes the $n$-dimensional column vector of all ones (and likewise for $Y$). Suppose $X'X = Y'Y \in \mathbb{R}^{n\times n}$. Does this necessarily mean $Y = RX$ for some orthogonal matrix $R \in O(d)$?
The centering above is not a big restriction: if we do not assume $X$ and $Y$ are centered, my question becomes: do there exist $R \in O(d)$ and $v \in \mathbb{R}^d$ such that $y_i = Rx_i + v$ for all $i$?
Let $X$ have the singular value decomposition $X = U \Sigma V'$, where $U$ and $V$ are orthogonal matrices of sizes $d \times d$ and $n \times n$ respectively, and $\Sigma$ is a $d \times n$ diagonal matrix. Since $Y' Y = X' X = V \Sigma' \Sigma V'$, we can write the singular value decomposition of $Y$ as $Y = \tilde{U} \Sigma V'$ for some orthogonal $\tilde{U}$, with the same $\Sigma$ and $V$. Now $U' X = U' U \Sigma V' = \Sigma V'$, so $Y = \tilde{U} \Sigma V' = \tilde{U} U' X = R X$, where $R = \tilde{U} U' = \tilde{U} U^{-1}$ is orthogonal.
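A quick numerical sanity check of this construction (my own sketch, not part of the proof): generate a centered $X$, an orthogonal $Q$, set $Y = QX$, then recover $\tilde{U}$ from $Y = \tilde{U}\Sigma V'$ and verify that $R = \tilde{U}U'$ is orthogonal and maps $X$ to $Y$.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 5

# Centered data matrix X (generic, so full rank d)
X = rng.standard_normal((d, n))
X = X - X.mean(axis=1, keepdims=True)

# Random orthogonal Q via QR; set Y = Q X, so X'X = Y'Y by construction
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
Y = Q @ X
assert np.allclose(X.T @ X, Y.T @ Y)              # Gram matrices agree

# SVD of X: X = U diag(s) Vt
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# From Y = U~ diag(s) Vt, solve U~ = Y V diag(1/s) (full rank assumed)
U_tilde = Y @ Vt.T @ np.diag(1.0 / s)

# R = U~ U' is orthogonal and carries X to Y
R_hat = U_tilde @ U.T
assert np.allclose(R_hat @ R_hat.T, np.eye(d))    # orthogonality
assert np.allclose(R_hat @ X, Y)                  # Y = R X
```

Here the generic full-rank case is assumed so that $\Sigma$ is invertible; with rank deficiency one would use a pseudoinverse and the recovered $R$ is unique only up to the null space.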
EDIT: The SVD is not unique, but in its construction $\Sigma$ and $V$ are obtained from the eigenvalues and eigenvectors of $X' X$ respectively. So if $X' X$ and $Y' Y$ are equal, you can use the same $\Sigma$ and $V$ for both.
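To illustrate the point of the EDIT (again just a sketch): the singular values of $X$ are the square roots of the eigenvalues of $X'X$, and the right singular vectors are its eigenvectors, so $\Sigma$ and $V$ depend on $X$ only through $X'X$.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 5
X = rng.standard_normal((d, n))

# Eigendecomposition of the (symmetric PSD) Gram matrix X'X
evals, V = np.linalg.eigh(X.T @ X)        # eigh returns ascending order
evals, V = evals[::-1], V[:, ::-1]        # reorder descending, as in an SVD
sigma = np.sqrt(np.clip(evals, 0.0, None))  # clip tiny negative round-off

# Compare with the singular values from the SVD of X itself
_, s, _ = np.linalg.svd(X)
assert np.allclose(sigma[:d], s)          # top d values match
assert np.allclose(sigma[d:], 0.0, atol=1e-8)  # the rest vanish (rank <= d)
```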