For a skew-symmetric matrix $A$ (meaning $A^T=-A$), the Pfaffian is defined by the equation $(\text{Pf}\,A)^2=\det A$. It is my understanding that this is defined for anti-symmetric matrices because it is known that the determinant of an anti-symmetric matrix is always the square of a polynomial in the entries of the matrix.
Now, skew-symmetry is sufficient for the determinant to be the square of a polynomial, but it is not necessary. The simplest example is the $2n\times 2n$ matrix $A=a I_{2n}$ with $a\in\mathbb{C}$ and $I_k$ the $k\times k$ identity matrix. Its determinant is $\det A = a^{2n} = (a^n)^2$, yet for $a\neq 0$, $A$ is not skew-symmetric.
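Both claims above are easy to check numerically. The sketch below (my own illustration, not a library routine; the recursive `pfaffian` helper uses the standard expansion along the first row and is only practical for small matrices) verifies $(\text{Pf}\,A)^2=\det A$ for a random skew-symmetric matrix, and the $aI_{2n}$ example:

```python
import numpy as np

def pfaffian(A):
    """Pfaffian of an even-dimensional skew-symmetric matrix,
    via expansion along the first row (fine for small matrices)."""
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2 == 1:
        return 0.0  # odd-dimensional skew-symmetric matrices have Pf = 0
    total = 0.0
    for j in range(1, n):
        # remove rows/columns 0 and j, with the alternating sign of the expansion
        idx = [k for k in range(n) if k not in (0, j)]
        total += (-1) ** (j - 1) * A[0, j] * pfaffian(A[np.ix_(idx, idx)])
    return total

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                       # skew-symmetric: A^T = -A
assert np.isclose(pfaffian(A) ** 2, np.linalg.det(A))

# Non-skew example from the text: det(a I_{2n}) = a^{2n} = (a^n)^2,
# even though a*I is not skew-symmetric for a != 0.
a, n = 3.0, 2
assert np.isclose(np.linalg.det(a * np.eye(2 * n)), (a ** n) ** 2)
```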
I have a few questions about this.
- Is there a generalization of a Pfaffian for any matrix whose determinant is a square of a polynomial?
- Is there a characterization (or some known set of properties) of matrices whose determinants are squares of polynomials?
- (Edit) Are there any known necessary and sufficient conditions for a matrix to have its determinant be the square of a polynomial (aside from skew-symmetry being sufficient)?
(Edit 2) For those who are curious, these questions arise from a problem from physics I am working on. I have a certain class of matrices whose characteristic polynomials (which arise as the determinant of a non-skew-symmetric matrix) appear to be the squares of Chebyshev polynomials. If I could prove that these characteristic polynomials must be squares of polynomials (using properties of the matrix) then I may be able to use some of the properties attributed to Pfaffians (or the proper generalization to non-skew-symmetric matrices) to confirm that they are indeed squared Chebyshev polynomials.
(Edit 3) To be as concrete as possible, I am looking for any information (e.g., answers to questions 1-3) on the set $$\{A\in\mathcal{M}_n(\mathbb{C}): \det A = p(\{a_{ij}\})^2\text{ with }p\text{ a polynomial} \}$$ where $\mathcal{M}_n(\mathbb{C})$ is the set of $n\times n$ complex matrices and $a_{ij}$ is the $i,j$'th entry of $A$.
Only a partial answer. The problem with defining the Pfaffian of a matrix whose determinant is the square of a polynomial is that the sign of the Pfaffian may not be well defined. For example, one may have two matrices $A$ and $B$ such that $\det A=\det B=(\text{polynomial})^2$ but $\operatorname{pf} A=-\operatorname{pf} B$. A possible approach is to think in terms of unitary transformations and equivalence classes.
Let $A$ be a $2n\times 2n$ matrix, not necessarily antisymmetric. Let $\mathcal U(A)$ be the set of all matrices $B=U A U^\dagger$ unitarily equivalent to $A$. Now consider the subset $\bar{\mathcal U}(A)\subset\mathcal U(A)$ of matrices which are antisymmetric. It is clear that all matrices in $\mathcal U(A)$ have the same determinant, and all matrices in $\bar{\mathcal U}(A)$ have the same Pfaffian. Therefore one can define the Pfaffian of any $A\in\mathcal U(A)$ as the Pfaffian of any $B\in\bar{\mathcal U}(A)$. In short, one can define
$$\operatorname{pf} A=\operatorname{pf}(U A U^\dagger)$$
if there exists a unitary matrix $U$ such that $U A U^\dagger$ is antisymmetric. (The unitary matrix need not be unique, of course.)
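A minimal numerical sketch of this construction (my own illustration, with hypothetical values): start from an antisymmetric $B$ in canonical block form, where $\operatorname{pf} B$ is just the product of the block entries, conjugate by a random complex unitary $U$ to get a matrix $A=UBU^\dagger$ that is generally no longer antisymmetric, and check that the proposed definition $\operatorname{pf} A := \operatorname{pf} B$ is consistent with $(\operatorname{pf} A)^2=\det A$, since the determinant is conjugation-invariant:

```python
import numpy as np

rng = np.random.default_rng(1)

# Antisymmetric B in canonical 2x2-block form; its Pfaffian is b1 * b2.
b1, b2 = 2.0, -0.5
B = np.zeros((4, 4))
B[0, 1], B[1, 0] = b1, -b1
B[2, 3], B[3, 2] = b2, -b2
pf_B = b1 * b2

# Random complex unitary U (QR of a complex Gaussian matrix).
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(M)
A = U @ B @ U.conj().T            # unitarily equivalent to B

# For complex U, conjugation does not preserve antisymmetry in general:
assert not np.allclose(A.T, -A)

# But the determinant is conjugation-invariant, so pf A := pf B
# is consistent with (pf A)^2 = det A.
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
assert np.isclose(pf_B ** 2, np.linalg.det(A))
```

Note that a real orthogonal $U$ would keep $UBU^T$ antisymmetric; a genuinely complex unitary is needed to leave $\bar{\mathcal U}(A)$ while staying inside $\mathcal U(A)$.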
Maybe this definition is a little tautological, but it makes sense from the point of view of physics, because unitary matrices are associated with (unitary) symmetries and therefore do not affect the values of measurable quantities. For example, the determinant of a Hamiltonian is invariant under unitary symmetries. With the definition above, the Pfaffian of a Hamiltonian has the same property.