Let $W_t$ be a univariate standard Brownian motion. Let $\vec{B}_t$ be a Brownian motion in $\mathbb{R}^m$. Let $\vec{S}_t$ be a vector-valued process in $(0,\infty)^d$, driven by $\vec{B}_t$ with rather general geometric-Brownian-motion dynamics: for each $i=1,2,\dotsc,d$, the coordinate $S_t^i$ satisfies the SDE $$dS_t^i=\mu^i(t,\vec{S}_.)S_t^i dt+\sum_{j=1}^m \sigma_{ij}(t,\vec{S}_.)S_t^i dB_t^j.$$
Here the notation $\sigma_{ij}(t, \vec{S}_.)$ means that $\sigma_{ij}$ is a function of time $t$ and the entire path of $\vec{S}$ up to time $t$, and hence so is the $d\times m$ matrix $\sigma(t, \vec{S}_.)$. Note that $\Sigma=\sigma \sigma^T$ is thus $d\times d$ and represents the instantaneous covariance matrix of the system $\log\vec{S}$ (the logarithm applied componentwise; I apologize for these slight abuses of notation). Similar comments apply to $\mu^i(t, \vec{S}_.)$ and $\vec{\mu}(t, \vec{S}_.)$.
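For concreteness, dynamics of this form can be simulated with a log-Euler scheme, which respects the constraint $\vec{S}_t \in (0,\infty)^d$. Below is a minimal sketch; the particular drift, volatility numbers, and running-maximum path dependence are illustrative assumptions of mine, not part of the setup above:

```python
import math
import random

def simulate_paths(d, m, mu, sigma, S0, T, n_steps, rng):
    """Log-Euler scheme for dS^i = mu^i(t, S_.) S^i dt + sum_j sigma_ij(t, S_.) S^i dB^j.

    mu(t, path) returns a length-d drift vector; sigma(t, path) returns a
    d x m matrix; `path` is the list of states simulated so far, so both
    coefficients may depend on the whole trajectory. Stepping log S keeps
    every coordinate strictly positive."""
    dt = T / n_steps
    sqdt = math.sqrt(dt)
    path = [list(S0)]
    for k in range(n_steps):
        t = k * dt
        mu_k, sig_k = mu(t, path), sigma(t, path)
        dB = [rng.gauss(0.0, sqdt) for _ in range(m)]
        prev, nxt = path[-1], []
        for i in range(d):
            var_i = sum(sig_k[i][j] ** 2 for j in range(m))   # Sigma_ii = (sigma sigma^T)_ii
            drift = (mu_k[i] - 0.5 * var_i) * dt              # Ito correction for log S^i
            noise = sum(sig_k[i][j] * dB[j] for j in range(m))
            nxt.append(prev[i] * math.exp(drift + noise))
        path.append(nxt)
    return path

# Hypothetical path-dependent volatility: sigma_11 depends on the running max of S^1.
mu0 = lambda t, path: [0.0, 0.0]
def sigma0(t, path):
    run_max = max(state[0] for state in path)
    return [[0.2 * math.sqrt(run_max), 0.0],
            [0.0, 0.3]]

rng = random.Random(0)
path = simulate_paths(2, 2, mu0, sigma0, [1.0, 1.0], 1.0, 100, rng)
```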
Consider two processes $X_t^{(1)}$ and $X_t^{(2)}$ defined by the SDEs $$dX_t^{(1)} = \sum_{i=1}^d \sum_{j=1}^m \alpha_t^i \sigma_{ij}(t,\vec{S}_.)X_t^{(1)} dB_t^j,$$ and $$dX_t^{(2)} = \sqrt{\vec{\alpha}_t^T \Sigma(t, \vec{S}_.) \vec{\alpha}_t} X_t^{(2)}dW_t,$$ where $\Sigma=\sigma\sigma^T$.
Conjecture: If $X_0^{(1)}=X_0^{(2)}$ a.s., then for all $t>0$, $$X_t^{(1)} \stackrel{(d)}{=} X_t^{(2)},$$ i.e. these are equal in distribution at every time $t>0$.
Special case: I believe we can prove this when the $\sigma_{ij}$ are functions of time $t$ alone. Indeed, the quadratic variations take the same form for $i=1,2$: $$d[X^{(i)}]_t = \vec{\alpha}_t^T \Sigma(t) \vec{\alpha}_t \,(X_t^{(i)})^2\, dt.$$ Applying Itô's lemma to $\log X_t^{(i)}$ shows that both $\log\left(X_t^{(i)}/X_0^{(i)}\right)$ are Gaussian with mean $$-\frac12 \int_0^t \vec{\alpha}_u^T \Sigma(u)\vec{\alpha}_u\, du$$ and variance $$\int_0^t \vec{\alpha}_u^T \Sigma(u)\vec{\alpha}_u\, du.$$ Thus the two are equal in distribution at each time.
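This special case can be sanity-checked by a quick Monte Carlo. The sketch below uses an assumed constant matrix $\sigma$ (my own choice of numbers, $d=m=2$, $\vec{\alpha}=(1,1)^T$) so that both SDEs integrate in closed form; $W$ is taken independent of $\vec{B}$, which does not affect the marginal laws being compared:

```python
import math
import random

sigma = [[0.2, 0.1], [0.05, 0.3]]   # assumed constant d x m volatility (illustrative)
alpha = [1.0, 1.0]
c = [sum(alpha[i] * sigma[i][j] for i in range(2)) for j in range(2)]  # alpha^T sigma
v = sum(cj ** 2 for cj in c)        # alpha^T Sigma alpha, constant in time here

T, N = 1.0, 40000
rng = random.Random(0)
log_x1, log_x2 = [], []
for _ in range(N):
    B = [rng.gauss(0.0, math.sqrt(T)) for _ in range(2)]   # B_T
    W = rng.gauss(0.0, math.sqrt(T))                        # W_T
    # exact solutions of the two SDEs (with X_0 = 1) when sigma is constant:
    log_x1.append(-0.5 * v * T + c[0] * B[0] + c[1] * B[1])
    log_x2.append(-0.5 * v * T + math.sqrt(v) * W)

def mean_var(xs):
    mu = sum(xs) / len(xs)
    return mu, sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)

m1, v1 = mean_var(log_x1)
m2, v2 = mean_var(log_x2)
```

Both samples should look like $N(-vT/2, vT)$, matching the Gaussian law derived above.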
Question: Is this conjecture true in general? How can we proceed to prove it?
Please comment with clarifications or corrections, especially on my little sketch of a proof of the special case, thanks!
The conjecture holds true only in the case where $\sigma_{ij}$ depends only on $t$ as in your special case.
We build a counter-example for the case where $\sigma_{ij}$ is a function of $t$ and $S$.
For notational clarity, I write indices as subscripts, for example $S_{2,t}$ (rather than in superscript as in your question, $S^2_t$).
Let us suppose $d = m = 2$, zero drift ($\mu \equiv 0$), $\alpha_{1,t} = \alpha_{2,t} = 1$, and the path-dependent volatility $\sigma(t, \vec{S}_.)$ chosen accordingly (note that the path of $\vec{B}$ can be recovered from the path of $\vec{S}$, so $\sigma$ may be expressed through $\vec{B}$).

Then

$$dX^{(1)}_t = X^{(1)}_t \left( B_{2,t}\,dB_{1,t} + B_{1,t}\,dB_{2,t} \right) = X^{(1)}_t\, d\!\left(B_{1,t}B_{2,t}\right), \tag{1}$$

$$dX^{(2)}_t = \left|B_{1,t} + B_{2,t}\right| X^{(2)}_t\, dW_t, \tag{2}$$

where the last equality in $(1)$ holds because $B_{1}$ and $B_{2}$ are independent, so $d(B_{1,t}B_{2,t}) = B_{2,t}\,dB_{1,t} + B_{1,t}\,dB_{2,t}$.
It's easy to deduce the analytical solution of $X^{(1)}_t$ and $X^{(2)}_t$ from $(1)$ and $(2)$: $$\begin{align} (1) &\Longrightarrow d{\ln\left(X^{(1)}_t \right)} = -\frac{1}{2}\left(B_{1,t}^2+B_{2,t}^2 \right)dt+d(B_{1,t}B_{2,t}) \\ &\Longrightarrow \color{red}{X^{(1)}_t = X^{(1)}_0\exp\left( -\frac{1}{2}\int_0^t\left(B_{1,s}^2+B_{2,s}^2 \right)ds+B_{1,t}B_{2,t} \right)} \tag{3} \end{align}$$ $$\begin{align} (2) &\Longrightarrow d{\ln\left(X^{(2)}_t \right)} = -\frac{1}{2}(B_{1,t}+B_{2,t})^2 dt+\left|B_{1,t} + B_{2,t} \right|dW_t \\ &\Longrightarrow \color{blue}{X^{(2)}_t = X^{(2)}_0\exp\left( -\frac{1}{2}\int_0^t(B_{1,s}+B_{2,s})^2ds+ \int_0^t \left|B_{1,s} + B_{2,s} \right|dW_s \right)} \tag{4} \end{align}$$
Even if we suppose $X^{(1)}_0 = X^{(2)}_0$, we cannot have $X^{(1)}_t \stackrel{(d)}{=}X^{(2)}_t$. Indeed, let us compare the two exponents:
$$\begin{align} \text{Exponent}(3)-\text{Exponent}(4) &=-\frac{1}{2}\int_0^t\left(B_{1,s}^2+B_{2,s}^2 \right)ds+B_{1,t}B_{2,t}+\frac{1}{2}\int_0^t(B_{1,s}+B_{2,s})^2ds- \int_0^t \left|B_{1,s} + B_{2,s} \right|dW_s\\ &=\color{red}{\int_0^t(B_{1,s}B_{2,s})ds+B_{1,t}B_{2,t}}- \color{blue}{ \int_0^t \left|B_{1,s} + B_{2,s} \right|dW_s} \tag{5} \end{align}$$ If $X^{(1)}_t \stackrel{(d)}{=}X^{(2)}_t$, the red term of $(5)$ must be equal in law to the blue term, and in particular their variances must be equal. However,
By the Itô isometry (and since an Itô integral has zero mean), the variance of the blue term of $(5)$ is equal to $$\begin{align} V\left(\int_0^t \left|B_{1,s} + B_{2,s} \right|dW_s \right) &= \mathbb{E}\left(\left(\int_0^t \left|B_{1,s} + B_{2,s} \right|dW_s \right)^2 \right) \\ &= \int_0^t \mathbb{E}\left|B_{1,s} + B_{2,s} \right|^2ds \\ &= \int_0^t 2sds \\ &= t^2 \\ \end{align}$$
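This variance can be checked by simulation. A minimal sketch (the horizon $t=1$, 200 Euler steps, and 4000 paths are my own discretization choices), approximating the Itô integral by a left-point sum:

```python
import math
import random

rng = random.Random(1)
T, n_steps, n_paths = 1.0, 200, 4000
dt = T / n_steps
sq = math.sqrt(dt)

samples = []
for _ in range(n_paths):
    b1 = b2 = 0.0
    ito = 0.0
    for _ in range(n_steps):
        ito += abs(b1 + b2) * rng.gauss(0.0, sq)  # left-point sum: |B1 + B2| dW
        b1 += rng.gauss(0.0, sq)                  # advance B1, B2 only after the increment
        b2 += rng.gauss(0.0, sq)
    samples.append(ito)

# the Ito integral has mean zero, so the variance is the mean square
var_blue = sum(x * x for x in samples) / n_paths
# theory above: Var = t^2 = 1.0
```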
The variance of the red term: $$\begin{align} L&:= V\left(\int_0^t(B_{1,s}B_{2,s})ds+B_{1,t}B_{2,t} \right)\\ &=\mathbb{E}\left(\int_0^t(B_{1,s}B_{2,s})ds+B_{1,t}B_{2,t} \right)^2 \\ &=\mathbb{E}\left(\int_0^t(B_{1,s}B_{2,s})ds\right)^2 +2\mathbb{E}\left(B_{1,t}B_{2,t}\int_0^t(B_{1,s}B_{2,s})ds \right) +\mathbb{E}\left(B_{1,t}B_{2,t} \right)^2 \tag{6} \end{align}$$ We can compute $(6)$ analytically. Since $B_{1}$ and $B_{2}$ are independent, $\mathbb{E}\left(B_{1,s}B_{2,s}B_{1,u}B_{2,u}\right)=\mathbb{E}\left(B_{1,s}B_{1,u}\right)\mathbb{E}\left(B_{2,s}B_{2,u}\right)=\min(s,u)^2$. Hence the first term of $(6)$ equals $\int_0^t\int_0^t \min(s,u)^2\,ds\,du=\frac{t^4}{6}$, the cross term equals $2\int_0^t s^2\,ds = \frac{2t^3}{3}$, and the last term equals $\mathbb{E}\left(B_{1,t}^2\right)\mathbb{E}\left(B_{2,t}^2\right)=t^2$. Therefore $$L = \frac{t^4}{6}+\frac{2t^3}{3}+t^2 > t^2 \quad \text{for all } t>0,$$ so the variances of the red and blue terms differ, and the two processes cannot be equal in distribution.
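As a numerical cross-check of $(6)$, the following Monte Carlo sketch estimates the variance of the red term at $t=1$ and compares it with the value $\frac{t^4}{6}+\frac{2t^3}{3}+t^2$ obtained from the independence of $B_1$ and $B_2$ (discretization parameters are my own choices):

```python
import math
import random

rng = random.Random(2)
T, n_steps, n_paths = 1.0, 200, 4000
dt = T / n_steps
sq = math.sqrt(dt)

samples = []
for _ in range(n_paths):
    b1 = b2 = 0.0
    integral = 0.0
    for _ in range(n_steps):
        integral += b1 * b2 * dt        # left-point Riemann sum for int_0^t B1 B2 ds
        b1 += rng.gauss(0.0, sq)
        b2 += rng.gauss(0.0, sq)
    samples.append(integral + b1 * b2)  # red term: int_0^t B1 B2 ds + B1_t B2_t

mean_red = sum(samples) / n_paths
var_red = sum((x - mean_red) ** 2 for x in samples) / (n_paths - 1)
exact = T ** 4 / 6 + 2 * T ** 3 / 3 + T ** 2   # = 11/6 at T = 1, strictly above t^2 = 1
```

The estimate sits well above $t^2 = 1$, the blue term's variance.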
Q.E.D.