Let $(f_n)$ be a sequence of Schwartz functions on $\mathbb{R}^N$ converging to a Schwartz function $f$ in the Schwartz space $\mathcal{S}(\mathbb{R}^N)$.
Now let $A : \mathbb{R}^N \to \mathbb{R}^N$ be any invertible matrix. Then the functions $f_{A,n}$ and $f_A$ defined by \begin{equation} f_{A,n}(x):= f_n(Ax), \qquad f_A(x):=f(Ax) \end{equation} are again Schwartz functions on $\mathbb{R}^N$.
Now, my question is
Is it true that $f_{A,n} \to f_A$ in $\mathcal{S}(\mathbb{R}^N)$?
I think this is very plausible, but I cannot justify it rigorously. Could anyone please help me?
Add 1. Let $\lVert f \rVert_{\alpha, \beta}:= \sup_{x \in \mathbb{R}^N} \lvert x^{\alpha} (D^{\beta} f)(x) \rvert$ for any multi-indices $\alpha$ and $\beta$. Then $f_n \to f$ in the Schwartz space means that $\lVert f_n-f \rVert_{\alpha, \beta} \to 0$ for all $\alpha, \beta$.
Add 2. I am aware that $\widehat{f_A}(\xi) = \frac{1}{\lvert \det A \rvert} \widehat{f}(A^{-T} \xi)$, where $\widehat{f}$ denotes the Fourier transform. I also know that the Fourier transform is a continuous linear bijection on the Schwartz space whose inverse is also continuous. However, this does not seem to resolve my main issue, because the composition with $A^{-T}$ persists inside $\widehat{f}$.
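To make the obstruction in Add 2 explicit (if I am computing correctly): by linearity of the Fourier transform,
```latex
\begin{equation*}
\widehat{f_{A,n}} - \widehat{f_A}
= \frac{1}{\lvert \det A \rvert}\,\bigl(\widehat{f_n} - \widehat{f}\,\bigr)\circ A^{-T},
\end{equation*}
```
so the Fourier route only restates the original question with $A^{-T}$ in place of $A$: one still has to show that composition with a fixed invertible matrix is continuous on $\mathcal{S}(\mathbb{R}^N)$.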
Add 3. Let $A= (A^i_j)$ and write $(Ax)^i = A^i_j x^j$ in components with respect to the standard basis of $\mathbb{R}^N$, where $i,j=1, \dots, N$ and I have used the summation convention. Then, by the chain rule, any partial derivative $(D^{\alpha} f_{A})(x)$ is written as a sum of terms of the form (homogeneous polynomial of degree $\lvert \alpha \rvert$ in the entries $A^i_j$) times $(D^{\gamma}f)(Ax)$ with $\lvert \gamma \rvert = \lvert \alpha \rvert$. I am stuck at how to use the invertibility of $A$ to show that $\lVert f_{A,n}-f_A \rVert_{\alpha, \beta} \to 0$ if $\lVert f_n-f \rVert_{\alpha, \beta} \to 0$. I also guess there must be some more elegant way to prove this, and the Fourier transform looked like a good candidate; but, as in the previous item, I still run into the problem of dealing with $A^{-T}\xi$.
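One possible route, which I have not fully verified, is to bound each seminorm of $f \circ A$ by finitely many seminorms of $f$, using the chain-rule expansion from Add 3 together with the invertibility of $A$ (the constants $c_\gamma(A)$ and $C_{A,\alpha,\beta}$ below depend only on $A$, $\alpha$, $\beta$):
```latex
\begin{align*}
% chain rule: each derivative of f \circ A is an A-combination of derivatives of f
D^{\beta}(f\circ A)(x)
  &= \sum_{\lvert\gamma\rvert=\lvert\beta\rvert} c_{\gamma}(A)\,(D^{\gamma}f)(Ax),\\
% invertibility: |x| \le \|A^{-1}\| |Ax|, and |y|^{m} \le m! \sum_{|\alpha'|=m} |y^{\alpha'}|
\lvert x^{\alpha}\rvert
  &\le \lVert A^{-1}\rVert^{\lvert\alpha\rvert}\,\lvert Ax\rvert^{\lvert\alpha\rvert}
   \le C_{\alpha}\,\lVert A^{-1}\rVert^{\lvert\alpha\rvert}
      \sum_{\lvert\alpha'\rvert=\lvert\alpha\rvert} \lvert (Ax)^{\alpha'}\rvert,\\
% substitute y = Ax (a bijection of \mathbb{R}^N), so the sup over x equals the sup over y
\lVert f\circ A\rVert_{\alpha,\beta}
  &\le C_{A,\alpha,\beta}
      \sum_{\lvert\alpha'\rvert=\lvert\alpha\rvert,\ \lvert\gamma\rvert=\lvert\beta\rvert}
      \lVert f\rVert_{\alpha',\gamma}.
\end{align*}
```
If this estimate is correct, applying it to $f_n - f$ would give $\lVert f_{A,n}-f_A \rVert_{\alpha,\beta} \to 0$ directly, i.e. $f \mapsto f\circ A$ is continuous on $\mathcal{S}(\mathbb{R}^N)$.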