I conjecture that in $\mathbb R^n$, every symmetric matrix is a non-uniform stretch. Am I correct?
By non-uniform stretch, I mean that if $T$ is a non-uniform stretch, there exists an orthonormal basis $\{t_1,\ldots,t_n\}$ of $\mathbb R^n$ and scalars $\lambda_1,\ldots,\lambda_n$ such that for all $x$ and all $i$, $T(x) \cdot t_i = \lambda_i \, x \cdot t_i$. This includes collapse ($\lambda_i = 0$) and reflection ($\lambda_i < 0$).
Support for the conjecture: First, consider symmetric matrices of the form $A^\top A$. Taking the singular value decomposition $A = U \Sigma V^\top$, we have $A^\top = V \Sigma^\top U^\top$, so $A^\top A = V \Sigma^\top U^\top U \Sigma V^\top = V \Sigma^\top \Sigma V^\top = V \Sigma^2 V^{-1}$. Since $V$ is orthogonal, it is a rotation (or reflection), and so we have a rotation, a non-uniform stretch along the standard basis, and a rotation back, resulting in a non-uniform stretch along an arbitrary orthonormal basis. Is that correct?
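As a quick numerical sanity check of this decomposition (a NumPy sketch, not part of the argument itself; note that `np.linalg.svd` returns $V^\top$, not $V$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# SVD: A = U @ diag(s) @ Vt, with U and Vt orthogonal
U, s, Vt = np.linalg.svd(A)

# A^T A = V Sigma^2 V^T: rotate, stretch by squared singular values, rotate back
lhs = A.T @ A
rhs = Vt.T @ np.diag(s**2) @ Vt
assert np.allclose(lhs, rhs)
```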
What about symmetric matrices that cannot be expressed in $\mathbb R^n$ as $A^\top A$? What geometric form can they take?
Note: This post provides relevant background but does not directly address this question.
Update
Based on comments below, an alternate way of phrasing this question may be: Prove that the set of orthogonally diagonalizable matrices in $\mathbb R^n$ is precisely the set of non-uniform stretches in $\mathbb R^n$.
First: what you call a "non-uniform stretch" is what is usually called "orthogonally diagonalizable."
Recall that if $A$ is a square matrix, then an eigenvector of $A$ is a nonzero vector $\mathbf{x}$ such that there exists a scalar $\lambda$ with $A\mathbf{x}=\lambda\mathbf{x}$. An $n\times n$ matrix is diagonalizable if and only if there exists a basis $\mathbf{v}_1,\ldots,\mathbf{v}_n$ such that each $\mathbf{v}_i$ is an eigenvector of $A$; and is orthogonally diagonalizable if there exists an orthonormal basis $\mathbf{v}_1,\ldots,\mathbf{v}_n$ such that each $\mathbf{v}_i$ is an eigenvector of $A$.
Moreover, if $[\mathbf{v}_1,\ldots,\mathbf{v}_n]$ is an orthonormal basis, then it is well known that for every vector $\mathbf{x}$ we have $$\mathbf{x} = (\mathbf{x}\cdot \mathbf{v}_1)\mathbf{v}_1 + \cdots + (\mathbf{x}\cdot \mathbf{v}_n)\mathbf{v}_n,$$ where $\cdot$ is the standard inner product (you can replace the dot product with an arbitrary inner product $\langle -,-\rangle$).
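This expansion is easy to verify numerically; here is a small NumPy sketch (the random orthonormal basis comes from a QR factorization, an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
# The columns of Q form a random orthonormal basis v_1, ..., v_n
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
x = rng.standard_normal(5)

# x = sum_i (x . v_i) v_i
recon = sum((x @ Q[:, i]) * Q[:, i] for i in range(5))
assert np.allclose(x, recon)
```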
You are requiring that $A\mathbf{x}\cdot t_i = \lambda_i\,\mathbf{x}\cdot t_i$ for all $\mathbf{x}$ and all $i$. Putting $\mathbf{x}=t_j$ we obtain $$\begin{align*} At_j &= \sum_{i=1}^n (At_j\cdot t_i)t_i\\ &= \sum_{i=1}^n (\lambda_i t_j\cdot t_i)t_i\\ &= \lambda_jt_j, \end{align*}$$ so each $t_j$ is an eigenvector of $A$. Conversely, if each $t_i$ is an eigenvector corresponding to $\lambda_i$, then for every vector $\mathbf{x}$, $$\begin{align*} A\mathbf{x}\cdot t_j &= \left( A\left(\sum_{i=1}^n (\mathbf{x}\cdot t_i)t_i\right)\right)\cdot t_j\\ &= \left( \sum_{i=1}^n (\mathbf{x}\cdot t_i)At_i\right)\cdot t_j\\ &= \sum_{i=1}^n (\mathbf{x}\cdot t_i)(At_i\cdot t_j)\\ &= \sum_{i=1}^n (\mathbf{x}\cdot t_i) (\lambda_i t_i\cdot t_j)\\ &= (\mathbf{x}\cdot t_j)\lambda_j\\ &= \lambda_j\, \mathbf{x}\cdot t_j, \end{align*}$$ (since $t_i\cdot t_j=0$ for $i\neq j$), satisfying your requirements.
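Here is a NumPy illustration of this equivalence (an illustrative sketch: `np.linalg.eigh` produces an orthonormal eigenbasis of a symmetric matrix, which then satisfies your "non-uniform stretch" condition):

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.standard_normal((4, 4))
A = S + S.T                  # a symmetric matrix

lam, T = np.linalg.eigh(A)   # columns of T: orthonormal eigenvectors t_i
x = rng.standard_normal(4)

# A x . t_i == lambda_i (x . t_i) for every i
for i in range(4):
    assert np.isclose((A @ x) @ T[:, i], lam[i] * (x @ T[:, i]))
```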
So your question really amounts to:

> Is every symmetric matrix with real coefficients orthogonally diagonalizable over $\mathbb{R}$?
The "over $\mathbb{R}$" clause is to ensure that all $\lambda_i$ are real numbers, and all vectors have real coordinates. It is "easier" to orthogonally diagonalize a matrix over $\mathbb{C}$ (a matrix $A$ with complex coefficients is orthogonally diagonalizable over $\mathbb{C}$ if and only if $AA^*=A^*A$, where $A^*$ is the conjugate transpose), than over $\mathbb{R}$.
The answer is "yes", as is classically known:
We need some lemmas.
Lemma 1. Let $A$ be an $n\times n$ matrix with real coefficients. Then the unique $n\times n$ matrix $B$ such that for all vectors $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^n$, $(A\mathbf{x}) \cdot \mathbf{y} = \mathbf{x}\cdot(B\mathbf{y})$, is $B=A^T$, the transpose of $A$.
Proof. Writing vectors as columns, we have that $\mathbf{x}\cdot\mathbf{y}=\mathbf{y}^T\mathbf{x}$ (matrix product, interpreting a $1\times 1$ matrix as a scalar).
Thus, we have $$(A\mathbf{x})\cdot \mathbf{y} = \mathbf{y}^T(A\mathbf{x}) = (\mathbf{y}^TA)\mathbf{x} = (A^T\mathbf{y})^T\mathbf{x} = \mathbf{x}\cdot(A^T\mathbf{y}).$$ So $A^T$ satisfies the equation. Conversely, if $B$ also satisfies the equation, then for all $\mathbf{x}$ and $\mathbf{y}$ we have $$\mathbf{x}\cdot(A^T\mathbf{y}) - \mathbf{x}\cdot(B\mathbf{y}) = \mathbf{x}\cdot(A^T-B)\mathbf{y} = 0,$$ so $(A^T-B)\mathbf{y}=\mathbf{0}$ for all $\mathbf{y}$, hence $A^T=B$. $\Box$
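Numerically, the adjoint identity of the Lemma reads as follows (a NumPy sketch with random data):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
x, y = rng.standard_normal(4), rng.standard_normal(4)

# (A x) . y == x . (A^T y): the transpose is the adjoint for the dot product
assert np.isclose((A @ x) @ y, x @ (A.T @ y))
```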
Over $\mathbb{C}$, the inner product is given by $\mathbf{x}\cdot\mathbf{y} = \mathbf{y}^*\mathbf{x}$, where $\mathbf{y}^*$ is the conjugate transpose of $\mathbf{y}$, and the unique matrix $B$ such that $A\mathbf{x}\cdot\mathbf{y} = \mathbf{x}\cdot B\mathbf{y}$ is $B=A^*$, the conjugate transpose of $A$.
Lemma 2. Let $A$ be an $n\times n$ matrix with complex coefficients such that $AA^*=A^*A$. If $\mathbf{x}$ is an eigenvector of $A$ corresponding to $\lambda$, then $\mathbf{x}$ is an eigenvector of $A^*$ corresponding to $\overline{\lambda}$. In particular, if $A=A^*$, then $\lambda$ is a real number.
Proof. Note that $$\lVert A\mathbf{x}\rVert^2 = (A\mathbf{x})\cdot(A\mathbf{x}) = \mathbf{x}\cdot (A^*A)\mathbf{x} = \mathbf{x}\cdot (AA^*)\mathbf{x} = (A^*\mathbf{x})\cdot(A^*\mathbf{x}) = \lVert A^*\mathbf{x}\rVert^2.$$
Since $A$ commutes with $A^*$, the matrix $A-\lambda I$ commutes with $(A-\lambda I)^* = A^*-\overline{\lambda}I$, so the computation above applies to $A-\lambda I$ and gives $$0 = \lVert (A-\lambda I)\mathbf{x}\rVert^2 = \lVert (A^*-\overline{\lambda}I)\mathbf{x}\rVert^2.$$ So $A^*\mathbf{x}=\overline{\lambda}\mathbf{x}$. When $A=A^*$, we conclude that $\mathbf{x}$ is an eigenvector of $A$ corresponding to both $\lambda$ and $\overline{\lambda}$, so $\lambda=\overline{\lambda}$ is a real number. $\Box$
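Lemma 2 can be seen concretely on a small normal matrix. The sketch below (illustrative, not part of the proof) uses the skew-symmetric rotation generator, which is normal but not Hermitian, and checks both $\lVert A\mathbf{x}\rVert = \lVert A^*\mathbf{x}\rVert$ and $A^*\mathbf{x}=\overline{\lambda}\mathbf{x}$ on its eigenvectors:

```python
import numpy as np

# A normal (A A* == A* A) but non-Hermitian matrix: eigenvalues are +-i
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
assert np.allclose(A @ A.conj().T, A.conj().T @ A)

lam, V = np.linalg.eig(A)
for i in range(2):
    x = V[:, i]
    # ||A x|| == ||A* x||, and A* x == conj(lambda) x, as in Lemma 2
    assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(A.conj().T @ x))
    assert np.allclose(A.conj().T @ x, np.conj(lam[i]) * x)
```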
Note that the characteristic polynomial of a symmetric matrix $A$ with real coefficients must therefore split over $\mathbb{R}$: it splits over $\mathbb{C}$, its roots are the eigenvalues of $A$, and by Lemma 2 those eigenvalues are real numbers.
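A quick NumPy check of this consequence: even when computed over $\mathbb{C}$ with `np.linalg.eig`, the eigenvalues of a random real symmetric matrix come out with zero imaginary part.

```python
import numpy as np

rng = np.random.default_rng(5)
S = rng.standard_normal((6, 6))
A = S + S.T

# eig works over C in general, but a real symmetric A has only real roots
lam = np.linalg.eig(A)[0]
assert np.allclose(lam.imag, 0)
```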
Now we can apply Schur's Lemma:
Schur's Lemma. Let $A$ be an $n\times n$ real matrix whose characteristic polynomial splits. Then there is an orthonormal basis $\beta$ of $\mathbb{R}^n$ such that the coordinate matrix $[L_A]_{\beta}$ of multiplication by $A$ relative to the basis $\beta$ is upper triangular.
Proof. Because the characteristic polynomial splits, we know there is a basis $\gamma=[\mathbf{v}_1,\ldots,\mathbf{v}_n]$ such that $[L_A]_{\gamma}$ is upper triangular (e.g., a Jordan Canonical basis); that is $A\mathbf{v}_i\in\mathrm{span}(\mathbf{v}_1,\ldots,\mathbf{v}_i)$ for $i=1,\ldots,n$.
We apply Gram-Schmidt to the basis $\gamma=[\mathbf{v}_1,\ldots,\mathbf{v}_n]$ to obtain an orthonormal basis $\beta=[\mathbf{u}_1,\ldots,\mathbf{u}_n]$, such that $\mathrm{span}(\mathbf{v}_1,\ldots,\mathbf{v}_i) = \mathrm{span}(\mathbf{u}_1,\ldots,\mathbf{u}_i)$ for each $i$. It follows that $[L_A]_{\beta}$ is upper triangular, as desired. $\Box$
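The span-preservation step can be illustrated numerically. In the sketch below (an illustrative construction, with Gram-Schmidt performed via QR, which produces the same nested spans), we build $A = PTP^{-1}$ with $T$ upper triangular, so the columns of $P$ triangularize $A$; orthonormalizing them still triangularizes $A$:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4
# A has a known triangularizing basis: A p_i lies in span(p_1, ..., p_i)
P = rng.standard_normal((n, n))
T = np.triu(rng.standard_normal((n, n)))
A = P @ T @ np.linalg.inv(P)

# Gram-Schmidt (here via QR) preserves the nested spans,
# so the orthonormal basis Q still gives an upper triangular Q^T A Q
Q, _ = np.linalg.qr(P)
B = Q.T @ A @ Q
assert np.allclose(np.tril(B, -1), 0)
```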
Theorem (sometimes called the Spectral Theorem). Let $A$ be an $n\times n$ matrix with real coefficients. Then $A$ is orthogonally diagonalizable over $\mathbb{R}$ if and only if $A$ is symmetric.
Proof. Assume first that $A$ is orthogonally diagonalizable, and let $[\mathbf{v}_1,\ldots,\mathbf{v}_n]$ be an orthonormal basis consisting of eigenvectors of $A$ corresponding to the real numbers $\lambda_1,\ldots,\lambda_n$. Let $D$ be the diagonal matrix with entries $\lambda_1,\ldots,\lambda_n$, and let $V$ be the matrix whose columns are the vectors $\mathbf{v}_1,\ldots,\mathbf{v}_n$. Then we have that $V^{-1}AV = D$, or equivalently (since the columns of $V$ are orthonormal) that $$A = VDV^T.$$ We prove that $A$ is symmetric. The $(i,k)$ entry of $VD$ is $\lambda_kv_{ik}$, where $v_{ik}$ is the $(i,k)$ entry of $V$, and $v_{kj}^T$ the $(k,j)$ entry of $V^T$ (which equals $v_{jk}$). So the $(i,j)$ entry of $A$ is $$a_{ij}=\sum_{k=1}^n \lambda_kv_{ik}v_{kj}^T = \sum_{k=1}^n \lambda_kv_{ik}v_{jk}.$$ On the other hand, the $(j,i)$ entry is $$a_{ji}=\sum_{k=1}^n \lambda_kv_{jk}v_{ki}^T = \sum_{k=1}^n \lambda_k v_{jk}v_{ik} = a_{ij}.$$ So $A$ is symmetric.
Conversely, assume that $A$ is symmetric. Then its characteristic polynomial splits, so there is an orthonormal basis $\beta=[\mathbf{v}_1,\ldots,\mathbf{v}_n]$ such that $A\mathbf{v}_i\in\mathrm{span}(\mathbf{v}_1,\ldots,\mathbf{v}_i)$ for $i=1,\ldots,n$.
Let $V$ be the matrix whose columns are the vectors in $\beta$. Then $A=VUV^T$, where $U$ is an upper triangular matrix. Then $$A = A^T = (VUV^T)^T = VU^TV^T.$$ But that means that $VUV^T = VU^TV^T$; since $V$ is invertible, it follows that $U=U^T$. But $U$ is upper triangular, and $U^T$ is lower triangular. Therefore, $U$ is actually diagonal, so that each of $\mathbf{v}_1,\ldots,\mathbf{v}_n$ is an eigenvector of $A$. Thus, there is an orthonormal basis consisting of eigenvectors of $A$, as desired. $\Box$
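Both directions of the theorem are easy to observe numerically (a NumPy sketch; `np.linalg.eigh` returns the orthonormal eigenbasis whose existence the theorem guarantees):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5

# Symmetric => orthogonally diagonalizable
S = rng.standard_normal((n, n))
A = S + S.T
lam, V = np.linalg.eigh(A)
assert np.allclose(V.T @ V, np.eye(n))          # V is orthogonal
assert np.allclose(V @ np.diag(lam) @ V.T, A)   # A = V D V^T

# Orthogonally diagonalizable => symmetric: V D V^T is symmetric
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B = Q @ np.diag(rng.standard_normal(n)) @ Q.T
assert np.allclose(B, B.T)
```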