Suppose we have a $T\times r$ matrix $A$ satisfying $$ A'A = T I_r.$$
How can we deduce that $$ \frac{1}{T} \sum_{t=1}^{T} \lVert A_t \rVert^2 = O_p(1),$$ where $A_t$ is the transpose of the $t$-th row of $A$? In particular, why is this quantity bounded in probability?
My solution: since $\lVert A_t \rVert^2 = \operatorname{tr}(A_t A_t')$ and $\sum_{t=1}^{T} A_t A_t' = A'A = T I_r$, \begin{align} \frac{1}{T} \sum_{t=1}^{T} \lVert{A_t}\rVert^2 &= \frac{1}{T} \sum_{t=1}^{T} \operatorname{tr}(A_t A_t') \\ &= \frac{1}{T} \operatorname{tr}\Big(\sum_{t=1}^{T} A_t A_t'\Big) \\ &= \frac{1}{T} \operatorname{tr}(T I_r) = r. \end{align}
The quantity is therefore $O(1)$, and a deterministic bound implies a bound in probability, so it is also $O_p(1)$. Is this correct?
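As a quick numerical sanity check (my own sketch, not from the paper), one can build an $A$ with $A'A = T I_r$ by scaling a matrix with orthonormal columns by $\sqrt{T}$, and then verify that the average squared row norm equals $r$ exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
T, r = 500, 3

# QR of a random T x r matrix gives Q with orthonormal columns (Q'Q = I_r);
# scaling by sqrt(T) yields A'A = T * I_r, matching the normalization above.
Q, _ = np.linalg.qr(rng.standard_normal((T, r)))
A = np.sqrt(T) * Q
assert np.allclose(A.T @ A, T * np.eye(r))

# (1/T) * sum_t ||A_t||^2, where A_t is the t-th row of A.
avg_sq_norm = (np.linalg.norm(A, axis=1) ** 2).mean()
print(avg_sq_norm)  # equals r = 3 up to floating-point error
```

The key step the code illustrates is that $\sum_t \lVert A_t\rVert^2 = \operatorname{tr}(A'A) = Tr$, so dividing by $T$ leaves $r$ regardless of $T$.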
The claim is made on page 213 of "Determining the Number of Factors in Approximate Factor Models" (https://onlinelibrary.wiley.com/doi/pdf/10.1111/1468-0262.00273)