Is there a way for me to change the order of multiplying these matrices?


Given an equation of this form: \begin{equation} \vec{X}=\sigma(t) B^{-1} \label{MasterEq} \end{equation} where $\vec{X}$ has the following components: \begin{equation} x_i= \sum_{\alpha = 0}^{n} \left( a_{i,\alpha} \cos(\omega_\alpha t)+b_{i,\alpha} \sin(\omega_\alpha t) \right) \label{periodicSum} \end{equation} $B$ is a square $n \times n$ matrix, and $\sigma$ is a $1 \times n$ vector defined by: \begin{equation} \sigma(t) =\begin{bmatrix} \beta_1\cos(\omega_1 t)+\delta_1 \sin(\omega_1 t) \\ \beta_2\cos(\omega_2 t)+\delta_2 \sin(\omega_2 t) \\ \vdots \\ \beta_n\cos(\omega_n t)+\delta_n \sin(\omega_n t) \end{bmatrix} \label{normalCoordinatesMotion} \end{equation}

I wrote some code to find the betas and deltas as vectors in my use case, which I then combined into the following matrix: \begin{equation} \Pi =\begin{bmatrix} \beta_1 & \beta_2 & \dots & \beta_n \\ \delta_1 & \delta_2 & \dots & \delta_n \end{bmatrix} \end{equation}

How would I go about substituting this matrix back into my original formula so that I can solve for $\vec X$? I want to combine it with $B$, so that I have one matrix responsible for the sine-cosine parts, and one matrix that combines the betas, deltas, and $B$. Is this possible?
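For concreteness, here is a minimal NumPy sketch of my setup. The numerical values of `omega`, `beta`, `delta`, and `B` are placeholders (my real code computes the betas and deltas); it just shows how I currently evaluate $\vec X = \sigma(t)\,B^{-1}$, treating $\sigma$ as a row vector:

```python
import numpy as np

# Placeholder values standing in for my computed quantities (n = 3)
omega = np.array([1.0, 2.0, 3.0])          # frequencies (assumed)
beta  = np.array([0.5, -1.2, 0.3])         # betas from my fitting code
delta = np.array([1.0, 0.4, -0.7])         # deltas from my fitting code
B = np.array([[2.0, 0.1, 0.0],
              [0.1, 2.0, 0.1],
              [0.0, 0.1, 2.0]])            # assumed invertible n x n matrix

# The 2 x n matrix Pi: betas in the first row, deltas in the second
Pi = np.vstack([beta, delta])

def x_of_t(t):
    """Evaluate X = sigma(t) B^{-1}, with sigma as a row vector."""
    sigma = beta * np.cos(omega * t) + delta * np.sin(omega * t)
    return sigma @ np.linalg.inv(B)
```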

A couple of observations on your post:

  1. Does index $\alpha$ really start at 0 rather than 1? It just seems odd that vector $\vec x$ uses sines and cosines of $\omega_0$ terms but vector $\sigma$ does not. Also, matrix $B$ may not be square (and invertible) anymore.
  2. You state $\sigma$ is a row vector, but you define it as a column vector.

Let $$c := \begin{bmatrix} \cos(\omega_1t) \\ \cos(\omega_2t) \\ \vdots \\ \cos(\omega_nt) \end{bmatrix}, \quad s := \begin{bmatrix} \sin(\omega_1t) \\ \sin(\omega_2t) \\ \vdots \\ \sin(\omega_nt) \end{bmatrix}, \quad \beta := \begin{bmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_n \end{bmatrix}, \quad \delta := \begin{bmatrix} \delta_1 \\ \delta_2 \\ \vdots \\ \delta_n \end{bmatrix}, \quad \Pi := \begin{bmatrix} \beta^\text T \\ \delta^\text T \end{bmatrix} = \begin{bmatrix} \beta & \delta \end{bmatrix}^\text T, \quad \vec 1_2 := \begin{bmatrix} 1 \\ 1 \end{bmatrix}.$$

I'm assuming the coefficients $a_{i,\alpha}$ and $b_{i,\alpha}$ come from some matrices $A := [a_{i,j}] _{i,j=1} ^n$ and $B := [b_{i,j}] _{i,j=1} ^n$. Then vector $\vec x$ can be computed as $$\vec x = A \, c +B \, s \equiv \begin{bmatrix} Ac & Bs \end{bmatrix} \vec 1_2.$$

Similarly, $\sigma$ may be seen as $$\DeclareMathOperator{\diag}{diag} \sigma = \diag(\beta) \, c +\diag(\delta) \, s \equiv \diag(c) \, \beta +\diag(s) \, \delta,$$ where $\diag(\cdot)$ converts a column vector into a diagonal matrix.

More equivalent expressions for $\sigma$ make use of the entrywise (Hadamard) matrix product $\odot$: $$\sigma \equiv \beta \odot c +\delta \odot s = \big( \begin{bmatrix} \beta & \delta \end{bmatrix} \odot \begin{bmatrix} c & s \end{bmatrix} \big) \vec 1_2 = \big( \Pi^\text T \odot \begin{bmatrix} c & s \end{bmatrix} \big) \vec 1_2.$$

Yet another equivalence: $\sigma$ equals the main diagonal of the matrix $\Pi^\text T \begin{bmatrix} c & s \end{bmatrix}^\text T$, or of its transpose $\begin{bmatrix} c & s \end{bmatrix} \Pi$, but I don't deem that especially helpful. I wouldn't combine $\sigma$ with $B^{-1}$, either.
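These identities are easy to check numerically. A quick NumPy sketch, with made-up values for $\omega$, $\beta$, $\delta$, comparing four of the equivalent expressions for $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
t = 0.7                        # arbitrary time; any scalar works
omega = rng.uniform(1.0, 3.0, n)   # made-up frequencies
beta = rng.standard_normal(n)      # made-up betas
delta = rng.standard_normal(n)     # made-up deltas

c = np.cos(omega * t)          # vector c defined above
s = np.sin(omega * t)          # vector s defined above
Pi = np.vstack([beta, delta])  # the 2 x n matrix Pi
ones2 = np.ones(2)             # the vector 1_2
cs = np.column_stack([c, s])   # the n x 2 matrix [c  s]

# Four equivalent ways to compute sigma(t)
sigma1 = np.diag(beta) @ c + np.diag(delta) @ s  # diag form
sigma2 = beta * c + delta * s                    # Hadamard form
sigma3 = (Pi.T * cs) @ ones2                     # (Pi^T . [c s]) 1_2
sigma4 = np.diag(cs @ Pi)                        # main diagonal of [c s] Pi

assert np.allclose(sigma1, sigma2)
assert np.allclose(sigma1, sigma3)
assert np.allclose(sigma1, sigma4)
```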