Inverting $T(x)=Ax+xA=Ax-\left(Ax\right)^T$, where $x$ and $T(x)$ are antisymmetric and $A$ is symmetric


I’m trying to find $T^{-1}$ so I can invert the linear transformation; equivalently, I want to solve $\widetilde{y}=\widetilde{A}\widetilde{x}+\widetilde{x}\widetilde{A}=\widetilde{A}\widetilde{x}-\left(\widetilde{A}\widetilde{x}\right)^T$ for $\widetilde{x}$. Does anyone know how this can be done while keeping $\widetilde{A}$ and $\widetilde{x}$ in matrix form? I know it can be done by arranging the upper off-diagonal components of $\widetilde{y}$ and $\widetilde{x}$ into two vectors of equal size. From there, it can be seen that $L\left(\widetilde{y}\right)=\widetilde{M}L\left(\widetilde{x}\right)$, where $\widetilde{M}$ can be built from the entries of $\widetilde{A}$. Then one computes $\widetilde{x}=L^{-1}\left({\widetilde{M}}^{-1}L\left(\widetilde{y}\right)\right)$. This is illustrated below. $$\left[\begin{matrix}0&y_{12}&y_{13}\\-y_{12}&0&y_{23}\\-y_{13}&-y_{23}&0\\\end{matrix}\right]=\left[\begin{matrix}0&x_{12}&x_{13}\\-x_{12}&0&x_{23}\\-x_{13}&-x_{23}&0\\\end{matrix}\right]\left[\begin{matrix}A_{11}&A_{12}&A_{13}\\A_{12}&A_{22}&A_{23}\\A_{13}&A_{23}&A_{33}\\\end{matrix}\right]+\left[\begin{matrix}A_{11}&A_{12}&A_{13}\\A_{12}&A_{22}&A_{23}\\A_{13}&A_{23}&A_{33}\\\end{matrix}\right]\left[\begin{matrix}0&x_{12}&x_{13}\\-x_{12}&0&x_{23}\\-x_{13}&-x_{23}&0\\\end{matrix}\right]=\left[\begin{matrix}0&x_{12}\left(A_{11}+A_{22}\right)+x_{13}A_{23}-x_{23}A_{13}&x_{12}A_{23}+x_{13}\left(A_{11}+A_{33}\right)+x_{23}A_{12}\\-x_{12}\left(A_{11}+A_{22}\right)-x_{13}A_{23}+x_{23}A_{13}&0&-x_{12}A_{13}+x_{13}A_{12}+x_{23}\left(A_{22}+A_{33}\right)\\-x_{12}A_{23}-x_{13}\left(A_{11}+A_{33}\right)-x_{23}A_{12}&x_{12}A_{13}-x_{13}A_{12}-x_{23}\left(A_{22}+A_{33}\right)&0\\\end{matrix}\right]$$ 
$$\left[\begin{matrix}y_{23}\\y_{13}\\y_{12}\\\end{matrix}\right]=\left[\begin{matrix}x_{23}\left(A_{22}+A_{33}\right)+x_{13}A_{12}-x_{12}A_{13}\\x_{23}A_{12}+x_{13}\left(A_{11}+A_{33}\right)+x_{12}A_{23}\\-x_{23}A_{13}+x_{13}A_{23}+x_{12}\left(A_{11}+A_{22}\right)\\\end{matrix}\right]=\left[\begin{matrix}A_{22}+A_{33}&A_{12}&-A_{13}\\A_{12}&A_{11}+A_{33}&A_{23}\\-A_{13}&A_{23}&A_{11}+A_{22}\\\end{matrix}\right]\left[\begin{matrix}x_{23}\\x_{13}\\x_{12}\\\end{matrix}\right]$$ $$\left[\begin{matrix}x_{23}\\x_{13}\\x_{12}\\\end{matrix}\right]=\left[\begin{matrix}A_{22}+A_{33}&A_{12}&-A_{13}\\A_{12}&A_{11}+A_{33}&A_{23}\\-A_{13}&A_{23}&A_{11}+A_{22}\\\end{matrix}\right]^{-1}\left[\begin{matrix}y_{23}\\y_{13}\\y_{12}\\\end{matrix}\right]$$ The problem with this method is that it’s ad hoc and requires manual setup for each dimension. I’d like a compact, notational way of calculating the inverse, one that avoids anything like $L\left(\widetilde{x}\right)$ and preferably uses only $\widetilde{A}$ and $\widetilde{y}$. Any help is appreciated.
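For concreteness, the $3\times 3$ reduction above can be checked numerically. The sketch below (assuming NumPy; the symmetric $A$ is randomly generated and the antisymmetric $x$ is an arbitrary test matrix) builds $\widetilde{M}$ from the entries of $A$, solves $L(\widetilde{y})=\widetilde{M}L(\widetilde{x})$, and reassembles $x$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3x3 example: random symmetric A, fixed antisymmetric x.
B = rng.standard_normal((3, 3))
A = B + B.T
x = np.array([[0.0, 1.0, 2.0],
              [-1.0, 0.0, 3.0],
              [-2.0, -3.0, 0.0]])

y = A @ x + x @ A  # forward map: y = T(x)

# M acting on L(x) = (x23, x13, x12), exactly as derived above
# (indices are 0-based in code, 1-based in the math).
M = np.array([
    [A[1, 1] + A[2, 2],  A[0, 1],            -A[0, 2]],
    [A[0, 1],            A[0, 0] + A[2, 2],   A[1, 2]],
    [-A[0, 2],           A[1, 2],             A[0, 0] + A[1, 1]],
])

Ly = np.array([y[1, 2], y[0, 2], y[0, 1]])  # L(y) = (y23, y13, y12)
x23, x13, x12 = np.linalg.solve(M, Ly)      # L(x) = M^{-1} L(y)

x_rec = np.array([[0.0,  x12,  x13],
                  [-x12, 0.0,  x23],
                  [-x13, -x23, 0.0]])
print(np.allclose(x_rec, x))
```

This recovers the original $x$ whenever $\widetilde{M}$ is invertible, which holds for generic symmetric $A$.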


Note: I assume that all matrices have real entries.

Here's a method that doesn't quite keep everything in "matrix form," but I've tried to keep things as compact as possible.

By the spectral theorem, $A$ has an orthonormal basis of eigenvectors $e_1,\dots,e_n$; let $\lambda_1,\dots,\lambda_n \in \Bbb R$ denote the associated eigenvalues. Let $E_{ij} = e_ie_j^T$ and $J_{ij} = E_{ij} - E_{ji}$. The set $\mathcal B = \{J_{ij}: 1 \leq i < j \leq n\}$ forms a basis for the set of skew-symmetric matrices. Moreover, this basis is orthogonal relative to the Frobenius inner product, with $\langle J_{ij}, J_{ij}\rangle = 2$ for each pair $i < j$.
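As a quick sanity check (a NumPy sketch, not part of the argument; the symmetric $A$ is randomly generated and $n=4$ is an arbitrary choice), one can build the $J_{ij}$ from an eigendecomposition and verify the orthogonality and norm claims:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n = 4  # illustrative size

B = rng.standard_normal((n, n))
A = B + B.T                               # random symmetric A

lam, U = np.linalg.eigh(A)                # orthonormal eigenvectors e_k = U[:, k]
e = [U[:, k] for k in range(n)]

# J_ij = e_i e_j^T - e_j e_i^T for i < j
J = {(i, j): np.outer(e[i], e[j]) - np.outer(e[j], e[i])
     for i, j in combinations(range(n), 2)}

# Pairwise Frobenius-orthogonal, each with squared norm 2.
ok = all(np.isclose(np.sum(J[p] * J[q]), 2.0 if p == q else 0.0)
         for p in J for q in J)
print(ok)
```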

We compute $$ AE_{ij} = Ae_i e_j^T = \lambda_i e_ie_j^T = \lambda_i E_{ij}. $$ It follows that $$ T(J_{ij}) = AJ_{ij} - (AJ_{ij})^T =(AE_{ij} - A E_{ji}) - (AE_{ij} - AE_{ji})^T \\ = (\lambda_i E_{ij} - \lambda_j E_{ji}) - (\lambda_i E_{ij} - \lambda_j E_{ji})^T \\ = (\lambda_i + \lambda_j)(E_{ij} - E_{ji}) \\= (\lambda_i + \lambda_j)J_{ij}. $$ That is, $J_{ij}$ is an eigen-"vector" (eigenmatrix, if you prefer) of $T$. So $T$ is diagonalizable and $\mathcal B$ is a complete set of associated eigenvectors. With that, we conclude that $T$ is invertible if and only if $A$ has no pair of eigenvalues of the form $\pm \lambda$ (that is, $\lambda_i + \lambda_j \neq 0$ for all $i < j$), and $$ T^{-1}(J_{ij}) = (\lambda_i + \lambda_j)^{-1}J_{ij}. $$ Using the orthogonality of $\mathcal B$, we can compute $T^{-1}(X)$ for an arbitrary antisymmetric matrix $X$ as follows: $$ T^{-1}(X) = \frac 12\sum_{1 \leq i < j \leq n} (\lambda_i + \lambda_j)^{-1}\langle X,J_{ij}\rangle J_{ij} $$ where $\langle X,Y \rangle$ denotes the aforementioned Frobenius inner product. Using the fact that $\langle X,J_{ij}\rangle = e_i^TXe_j - e_j^TXe_i$, we can rewrite this as $$ T^{-1}(X) = \frac 12\sum_{1 \leq i < j \leq n} (\lambda_i + \lambda_j)^{-1}(e_i^TXe_j - e_j^TXe_i) J_{ij}. $$
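The eigenbasis formula translates directly into code. Here is a minimal NumPy sketch (random symmetric $A$ and random antisymmetric $Y$, sizes chosen arbitrarily) that implements the sum over $i < j$ and round-trips it against the forward map:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n = 4

B = rng.standard_normal((n, n))
A = B + B.T                      # random symmetric A

lam, U = np.linalg.eigh(A)       # eigenvalues lam[k], eigenvectors U[:, k]

def T(x):
    """Forward map T(x) = A x + x A."""
    return A @ x + x @ A

def T_inv(Y):
    """Invert T via the eigenbasis expansion above.

    Uses <Y, J_ij> = e_i^T Y e_j - e_j^T Y e_i and ||J_ij||^2 = 2.
    Assumes no pair of eigenvalues of A sums to zero.
    """
    X = np.zeros_like(Y)
    for i, j in combinations(range(n), 2):
        ei, ej = U[:, i], U[:, j]
        Jij = np.outer(ei, ej) - np.outer(ej, ei)
        coeff = (ei @ Y @ ej - ej @ Y @ ei) / (2.0 * (lam[i] + lam[j]))
        X += coeff * Jij
    return X

# Round-trip check on a random antisymmetric Y.
C = rng.standard_normal((n, n))
Y = C - C.T
print(np.allclose(T(T_inv(Y)), Y))
```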


Alternatively, we could express the transformation as follows. Let $U$ denote the matrix whose columns are $e_1,\dots,e_n$. Define the transformations $\phi_1,\phi_2,\phi_3$ over the set of antisymmetric matrices as follows: $$ \phi_1(X) = U^TXU,\\ [\phi_2(X)]_{ij} = (\lambda_i + \lambda_j)^{-1} X_{ij} \quad \text{for all } i \neq j, \quad [\phi_2(X)]_{ii} = 0 \quad \text{for } 1 \leq i \leq n,\\ \phi_3(X) = UXU^T. $$ Note that $\phi_1$ and $\phi_3$ are mutually inverse (since $U$ is orthogonal). Indeed, writing $A = U\Lambda U^T$, we have $U^TT(x)U = \Lambda(U^TxU) + (U^TxU)\Lambda$, so $[\phi_1(T(x))]_{ij} = (\lambda_i+\lambda_j)[\phi_1(x)]_{ij}$, and $\phi_2$ undoes exactly this scaling. We find that $$ T^{-1}(X) = \phi_3(\phi_2(\phi_1(X))). $$
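This composition is the most compact to implement, since $\phi_2$ is just an elementwise division in the eigenbasis. A NumPy sketch (random symmetric $A$, random antisymmetric $Y$, sizes arbitrary; assumes $\lambda_i + \lambda_j \neq 0$ for $i \neq j$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5

B = rng.standard_normal((n, n))
A = B + B.T                                  # random symmetric A

lam, U = np.linalg.eigh(A)                   # A = U diag(lam) U^T

def T_inv(X):
    """T^{-1} = phi_3 ∘ phi_2 ∘ phi_1, as defined above."""
    Z = U.T @ X @ U                          # phi_1: rotate into the eigenbasis
    D = lam[:, None] + lam[None, :]          # D_ij = lam_i + lam_j
    W = np.where(np.eye(n, dtype=bool), 0.0, Z / D)  # phi_2: scale off-diagonal
    return U @ W @ U.T                       # phi_3: rotate back

# Check: T(T^{-1}(Y)) = Y for a random antisymmetric Y.
C = rng.standard_normal((n, n))
Y = C - C.T
X = T_inv(Y)
print(np.allclose(A @ X + X @ A, Y))
```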