How to prove a process is a forward process based on the time independence of an inner product


I came across a problem in a stochastic calculus assignment, which reads:

Suppose $\vec{y}(t), \vec{x}(t) \in \mathbb{R}^n$ are two vector-valued functions of $t$. A backward process for the nonsingular $n\times n$ matrix $A$ is defined as a process $\vec{x}(t)$ satisfying $\frac{d}{dt}\vec{x}(t)=A\vec{x}(t)$. Suppose $\langle\vec{y}(t), \vec{x}(t)\rangle$ is independent of time $t$ for every backward process $\vec{x}(t)$; show that $\vec{y}(t)$ is a forward process in the sense that $\frac{d}{dt}\vec{y}(t)=A^*\vec{y}(t)$, where $A^*$ is the adjoint of $A$ and $\langle\cdot,\cdot\rangle$ denotes some inner product on $\mathbb{R}^n$.
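For concreteness (this is just my own sanity check, not part of the assignment): if the inner product is $\langle\vec{u},\vec{v}\rangle = \vec{u}^TG\vec{v}$ for some symmetric positive definite matrix $G$, then the adjoint is $A^* = G^{-1}A^TG$. A quick numerical check of the defining identity $\langle\vec{y}, A\vec{x}\rangle = \langle A^*\vec{y}, \vec{x}\rangle$, with all matrices and vectors made up at random:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Made-up data: a random A and a random symmetric positive definite G
# representing the inner product <u, v> = u^T G v.
A = rng.standard_normal((n, n))
M = rng.standard_normal((n, n))
G = M @ M.T + n * np.eye(n)

# Adjoint with respect to this inner product: A* = G^{-1} A^T G.
A_star = np.linalg.solve(G, A.T @ G)

y = rng.standard_normal(n)
x = rng.standard_normal(n)

lhs = y @ G @ (A @ x)        # <y, A x>
rhs = (A_star @ y) @ G @ x   # <A* y, x>
print(lhs, rhs)              # the two values agree to machine precision
```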

Let's assume all vectors are column vectors. What I thought is: let $G$ be the symmetric positive definite matrix representing the inner product, so $\langle\vec{u},\vec{v}\rangle = \vec{u}^TG\vec{v}$. Then $$ \frac{d}{dt}\langle\vec{y}(t), \vec{x}(t)\rangle = \left(\frac{d}{dt}\vec{y}(t)\right)^{\!T} G\vec{x}(t) + \vec{y}^T(t)G\frac{d}{dt}\vec{x}(t) = \left\langle\frac{d}{dt}\vec{y}(t),\vec{x}(t)\right\rangle + \left\langle\vec{y}(t),\frac{d}{dt}\vec{x}(t)\right\rangle = 0. $$ We also have \begin{align*} \left\langle\frac{d}{dt}\vec{y}(t),\vec{x}(t)\right\rangle + \left\langle\vec{y}(t),\frac{d}{dt}\vec{x}(t)\right\rangle &= \left\langle\frac{d}{dt}\vec{y}(t),\vec{x}(t)\right\rangle + \langle\vec{y}(t),A\vec{x}(t)\rangle \\ &= \left\langle\frac{d}{dt}\vec{y}(t),\vec{x}(t)\right\rangle + \langle A^*\vec{y}(t),\vec{x}(t)\rangle \\ &= \left\langle\frac{d}{dt}\vec{y}(t) + A^*\vec{y}(t),\vec{x}(t)\right\rangle. \end{align*} Since at any fixed time $\vec{x}(t)$ can be an arbitrary vector in $\mathbb{R}^n$ (a backward process can be started from any initial value), this forces $\frac{d}{dt}\vec{y}(t) + A^*\vec{y}(t)=0$, i.e. $\frac{d}{dt}\vec{y}(t) = -A^*\vec{y}(t)$, with a minus sign that the problem statement doesn't have. Am I wrong somewhere?
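To convince myself about the sign, here is a numerical sketch (assuming the $G$-weighted inner product as above, the $G$-adjoint $A^* = G^{-1}A^TG$, and randomly made-up matrices; the two linear ODEs are solved with matrix exponentials). With $\frac{d}{dt}\vec{x}=A\vec{x}$ and $\frac{d}{dt}\vec{y}=-A^*\vec{y}$, the inner product $\langle\vec{y}(t),\vec{x}(t)\rangle$ does appear to stay constant:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 4

# Made-up data: a random (almost surely nonsingular) A and a random
# SPD matrix G representing the inner product <u, v> = u^T G v.
A = rng.standard_normal((n, n))
M = rng.standard_normal((n, n))
G = M @ M.T + n * np.eye(n)

# Adjoint of A with respect to this inner product: A* = G^{-1} A^T G.
A_star = np.linalg.solve(G, A.T @ G)

x0 = rng.standard_normal(n)
y0 = rng.standard_normal(n)

# x(t) = expm(tA) x0 solves dx/dt = A x (a backward process);
# y(t) = expm(-t A*) y0 solves dy/dt = -A* y (note the minus sign).
ips = []
for t in np.linspace(0.0, 2.0, 5):
    x_t = expm(t * A) @ x0
    y_t = expm(-t * A_star) @ y0
    ips.append(y_t @ G @ x_t)   # <y(t), x(t)>

print(ips)  # all entries numerically equal
```

Dropping the minus sign (i.e. using $\frac{d}{dt}\vec{y}=+A^*\vec{y}$) makes the printed values drift, which is why I suspect the sign in the problem statement.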

Thank you for any help!