Suppose the origin $x(t) = 0$ of the linear delay differential equation $\dot{x}(t) = A x(t) + A_d x(t - r(t))$ is asymptotically stable for some delay function $r(t)$. Does this imply the asymptotic stability of $\dot{x}(t) = A^{T} x(t) + A_d^{T} x(t - r(t))$ for the same $r(t)$?
Note: I am studying the stability of multi-agent systems subject to time delays. If it were possible to analyze the transposed system instead, it could simplify the problem I am working on.
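For context: in the constant-delay case $r(t) \equiv r$, the two systems share the characteristic equation, since $\det(sI - A - A_d e^{-sr}) = \det(sI - A^T - A_d^T e^{-sr})$ by $\det M = \det M^T$, so stability transfers; my question is really about time-varying $r(t)$, where this frequency-domain argument no longer applies. A minimal numerical sketch of the constant-delay case (forward Euler with a constant pre-history, toy matrices of my own choosing, not a proof):

```python
import numpy as np

def simulate_dde(A, Ad, r, x0, T=20.0, dt=1e-3):
    """Forward-Euler simulation of x'(t) = A x(t) + Ad x(t - r)
    with constant delay r and constant history x(t) = x0 for t <= 0."""
    steps = int(T / dt)
    lag = int(r / dt)
    xs = np.empty((steps + 1, len(x0)))
    xs[0] = x0
    for n in range(steps):
        # delayed state; fall back to the constant pre-history for t - r <= 0
        x_del = xs[n - lag] if n >= lag else x0
        xs[n + 1] = xs[n] + dt * (A @ xs[n] + Ad @ x_del)
    return xs

# Illustrative data (hypothetical, not from any specific application):
A  = np.array([[-2.0, 1.0], [0.0, -2.0]])
Ad = np.array([[0.5, 0.0], [0.2, 0.5]])
r, x0 = 0.5, np.array([1.0, 1.0])

x  = simulate_dde(A,   Ad,   r, x0)   # original system
xT = simulate_dde(A.T, Ad.T, r, x0)   # transposed system
print(np.linalg.norm(x[-1]), np.linalg.norm(xT[-1]))  # both decay toward 0
```

Both trajectories decay together here, consistent with the shared characteristic equation; the open part of the question is whether anything of this sort survives when $r(t)$ varies in time.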