Given two matrices $A_0, A_1$ whose symmetric parts are negative definite:
$A_{0}^{T} + A_{0} < 0$,
$A_{1}^{T} + A_{1} < 0$.
How could one prove that $A_{2}^{T} + A_{2} < 0$ for
$A_{2} = (A_{1}A_{0}^{-1})^{\alpha}A_{0}$ and $\alpha \in [0, 1]$?
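For anyone who wants to get a feel for the claim before attempting a proof, here is a minimal numerical sanity check in Python (not a proof). It assumes the power in the question is the principal matrix power, computed with `scipy.linalg.fractional_matrix_power`; the random test matrices and the helper name are my own illustrative choices, not part of the question.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def random_neg_def_sym_part(n, rng):
    """Random real n x n matrix A with A + A^T negative definite."""
    R = rng.standard_normal((n, n))
    sym_max = np.max(np.linalg.eigvalsh(0.5 * (R + R.T)))
    # Shift by slightly more than the largest eigenvalue of the symmetric part
    return R - (sym_max + 1.0) * np.eye(n)

rng = np.random.default_rng(0)
n = 5
for _ in range(200):
    A0 = random_neg_def_sym_part(n, rng)
    A1 = random_neg_def_sym_part(n, rng)
    alpha = rng.uniform(0.0, 1.0)
    # A2 = (A1 A0^{-1})^alpha A0, using the principal fractional power
    M = fractional_matrix_power(A1 @ np.linalg.inv(A0), alpha)
    A2 = M.real @ A0  # imaginary round-off was negligible in these trials
    assert np.max(np.linalg.eigvalsh(A2 + A2.T)) < 0
print("A2^T + A2 was negative definite in every trial")
```

In my runs this never produced a counterexample, which is consistent with the claim but of course does not prove it.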