Let $A = (a_{i,j})$ be a constant $n \times n$ matrix. Identify points $(\vec{x}_1 , \dots, \vec{x}_n) \in \mathbb{R}^{n^2}$, $\vec{x}_j =(x_{1,j}, \dots, x_{n,j}) \in \mathbb{R}^n$, $1 \le j \le n$, with $n \times n$ matrices by arranging the vectors $\vec{x}_j$ into columns: $$\mathbb{R}^{n^2} \ni (\vec{x}_1 , \dots, \vec{x}_n) = \begin{bmatrix} x_{1,1} & \cdots & x_{1,n} \\ \vdots & \ddots & \vdots \\ x_{n,1} & \cdots & x_{n,n} \end{bmatrix} \equiv X. $$
Define the smooth map $F: \mathbb{R}^{n^2} \to \mathbb{R}^{n^2}$ by $$F(\vec{x}_1 , \dots, \vec{x}_n) = X^T A X,$$ where $X^T$ denotes matrix transpose.
I would like to find the $n^2 \times n^2$ Jacobian matrix $JacF(X)$ of the map $F$ at the point $X = I$, where $I$ is the $n \times n$ identity matrix. In particular, I would like to know if $\det(A) \neq 0$ implies that the Jacobian matrix at $I$ is also nonsingular.
If we label the component functions of $F$ as $F = (F_{1,1}, \dots, F_{n,1}, \dots , F_{1,n}, \dots, F_{n,n})$, I have determined that $$F_{i,j} = \sum_{k,l =1}^n a_{k,l}x_{k,i}x_{l,j}. $$
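As a sanity check (not part of the derivation), the component formula can be verified numerically against the matrix product $X^T A X$; the variable names below are my own:

```python
import numpy as np

# Verify that F_{i,j} = sum_{k,l} a_{k,l} x_{k,i} x_{l,j}
# agrees entrywise with (X^T A X)_{i,j} for random A and X.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

F_matrix = X.T @ A @ X
F_components = np.array([[sum(A[k, l] * X[k, i] * X[l, j]
                              for k in range(n) for l in range(n))
                          for j in range(n)] for i in range(n)])

assert np.allclose(F_matrix, F_components)
```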
But the calculation appears to become quite tedious from here, since for each $F_{i,j}$ we need to calculate $n^2$ partial derivatives.
Is there a more efficient way to determine whether $\det(JacF(I)) \neq 0$, perhaps by using total derivatives or the chain rule in some fashion? Hints or answers are greatly appreciated!
Since $F(X+H)-F(X)=H^TAX+X^TAH+O(\|H\|^2)$, the derivative of $F$ at $X$ is the linear map $H\mapsto H^TAX+X^TAH$. Hence the Jacobian matrix of $F$ at $X$ is $((AX)^T\otimes I_n)K^{(n,n)}+I_n\otimes(X^TA)$, where $K^{(n,n)}$ denotes the $n^2\times n^2$ commutation matrix.
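Both claims can be checked numerically; the following sketch (variable names are mine, and $\operatorname{vec}$ is taken column-stacking throughout) compares $F(X+\epsilon H)-F(X)$ against the linear map, and the Kronecker-product Jacobian against its action on $\operatorname{vec}(H)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))
H = rng.standard_normal((n, n))

vec = lambda M: M.flatten(order="F")  # column-stacking vec

# Commutation matrix K^{(n,n)}: K @ vec(M) = vec(M^T).
K = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        K[i + n * j, j + n * i] = 1.0

F = lambda Y: Y.T @ A @ Y
dF = lambda H: H.T @ A @ X + X.T @ A @ H  # claimed derivative at X

# First-order check: F(X + eps*H) - F(X) ~ eps * dF(H) up to O(eps^2).
eps = 1e-6
assert np.allclose(F(X + eps * H) - F(X), eps * dF(H), atol=1e-9)

# Jacobian check: ((AX)^T kron I) K + I kron (X^T A) acts on vec(H)
# exactly as the derivative acts on H.
Jac = np.kron((A @ X).T, np.eye(n)) @ K + np.kron(np.eye(n), X.T @ A)
assert np.allclose(Jac @ vec(H), vec(dF(H)))
```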
At $X=I$, the derivative is thus $H\mapsto H^TA+AH$, which might be singular. E.g. we have $H^TA+AH=0$ when $H=\pmatrix{1&0\\ 0&-1}$ and $A=\pmatrix{0&1\\ 1&0}$. The Jacobian matrix in this example is $$ \pmatrix{0&2&0&0\\ 1&0&0&1\\ 1&0&0&1\\ 0&0&2&0}, $$ which has $\operatorname{vec}(H)=(1,0,0,-1)^T$ in its null space.
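The counterexample is easy to confirm numerically: $H^TA+AH$ vanishes, $\operatorname{vec}(H)=(1,0,0,-1)^T$ is annihilated by the stated Jacobian, and the Jacobian is singular even though $\det(A)=-1\neq 0$:

```python
import numpy as np

A = np.array([[0.0, 1.0], [1.0, 0.0]])
H = np.diag([1.0, -1.0])

# The derivative at X = I kills this H.
assert np.allclose(H.T @ A + A @ H, 0)

# The 4x4 Jacobian from the answer has vec(H) = (1,0,0,-1)^T in its null space.
Jac = np.array([[0.0, 2.0, 0.0, 0.0],
                [1.0, 0.0, 0.0, 1.0],
                [1.0, 0.0, 0.0, 1.0],
                [0.0, 0.0, 2.0, 0.0]])
vecH = H.flatten(order="F")  # column-stacking vec: (1, 0, 0, -1)
assert np.allclose(Jac @ vecH, 0)

# So det(A) != 0 does not force the Jacobian at I to be nonsingular.
assert abs(np.linalg.det(Jac)) < 1e-12
```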