Let $Y$ be a multivariate normal random vector with covariance $\Sigma$. Let $A_0,A_1$ be matrices such that $$A_0\Sigma A_1=0.$$ It is known that in this case $Y'A_0Y$ and $Y'A_1Y$ are independent quadratic forms. For a real vector $c$, is it true that the quadratic forms $$(Y+c)'A_0 (Y+c) \quad \text{and}\quad Y'A_1Y$$ are independent?
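As a numerical sanity check (not a proof), one can simulate the two forms and look at their sample correlation, which should be near zero if they are independent. The specific $\Sigma$, $A_0$, $A_1$, $c$ below are hypothetical choices satisfying $A_0\Sigma A_1=0$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example satisfying A0 @ Sigma @ A1 = 0: Sigma is block
# diagonal, A0 acts on the first block and A1 on the second.
Sigma = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [0.0, 0.0, 1.0]])
A0 = np.diag([1.0, 1.0, 0.0])
A1 = np.diag([0.0, 0.0, 1.0])
assert np.allclose(A0 @ Sigma @ A1, 0.0)

c = np.array([1.0, -2.0, 0.5])  # an arbitrary shift vector
Y = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)

# The two quadratic forms: (Y+c)'A0(Y+c) and Y'A1 Y, one value per sample.
q0 = np.einsum('ni,ij,nj->n', Y + c, A0, Y + c)
q1 = np.einsum('ni,ij,nj->n', Y, A1, Y)

# Independent variables are uncorrelated, so the sample correlation
# should be close to 0 (this checks a necessary condition only).
print(np.corrcoef(q0, q1)[0, 1])
```

Zero correlation alone does not establish independence, of course; the simulation only makes the claim plausible.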
I have tried to work with the characteristic functions of these quadratic forms, but cannot get further than the standard fact that, for independent variables, the characteristic function of the sum is the product of the characteristic functions.
Edit: View the two independent quadratic forms as $f(Y)$ and $g(Y)$, where $f,g:\mathbb{R}^d \rightarrow \mathbb{R}$. The question then becomes: does it follow that $f(Y+c)$ and $g(Y)$ are independent, where $c \in \mathbb{R}^d$?
Proof: Assume $A_0$, $A_1$ are symmetric positive semidefinite, and let $B$, $C$ be full-row-rank matrices with $B'B=A_0$ and $C'C=A_1$ (a rank factorization, obtained e.g. from the spectral decompositions of $A_0$ and $A_1$; note that $BB'$ and $CC'$ are then invertible). Since $$(Y+c)'A_0(Y+c)=\|B(Y+c)\|^2 \quad\text{and}\quad Y'A_1Y=\|CY\|^2,$$ the first form is a function of $BY$ and the second a function of $CY$, so it suffices to show that $BY$ and $CY$ are independent random vectors. They are jointly normal (both are linear images of the same Gaussian vector $Y$), hence $Cov(BY,CY)=0$ would mean they are independent. Well $$Cov(BY,CY)=B\Sigma C'.$$ From $A_0\Sigma A_1=0$ we have $B'(B\Sigma C')C=0$; multiplying on the left by $(BB')^{-1}B$ and on the right by $C'(CC')^{-1}$ gives $B\Sigma C'=0$, which means that $Cov(BY,CY)=0.$
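The proof's ingredients can be checked numerically. Below, `rank_factor` is an ad-hoc helper (not a library routine) that builds a full-row-rank $B$ with $B'B=A$ from the spectral decomposition, and the matrices are the same hypothetical example as above:

```python
import numpy as np

def rank_factor(A, tol=1e-12):
    """Full-row-rank B with B.T @ B == A, from the spectral decomposition.

    A is assumed symmetric positive semidefinite.
    """
    w, V = np.linalg.eigh(A)
    keep = w > tol                      # drop the null eigenvalues
    return np.sqrt(w[keep])[:, None] * V[:, keep].T

# Same hypothetical example: block-diagonal Sigma, A0 @ Sigma @ A1 = 0.
Sigma = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [0.0, 0.0, 1.0]])
A0 = np.diag([1.0, 1.0, 0.0])
A1 = np.diag([0.0, 0.0, 1.0])

B, C = rank_factor(A0), rank_factor(A1)
assert np.allclose(B.T @ B, A0) and np.allclose(C.T @ C, A1)

# Key identity of the proof: Cov(BY, CY) = B @ Sigma @ C.T = 0.
assert np.allclose(B @ Sigma @ C.T, 0.0)

# (Y+c)'A0(Y+c) is a function of BY alone: it equals ||B(Y+c)||^2.
rng = np.random.default_rng(1)
c = np.array([1.0, -2.0, 0.5])
y = rng.standard_normal(3)
assert np.isclose((y + c) @ A0 @ (y + c), np.sum((B @ (y + c)) ** 2))
print("all checks passed")
```

The `eigh`-based factorization is one convenient choice; any $B$, $C$ with full row rank and $B'B=A_0$, $C'C=A_1$ would serve the argument equally well.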