Consider the differential equation
$$ \frac{\text{d} y}{\text{d} t} = \left(c_0 + c_1 t^{-1} + c_2 t^{-2} + \dots + c_n t^{-n} \right) y ;\quad y(1) = y_0 > 0; \quad t \geq 1, $$
where $c_i > 0$ for each $i$. This can be solved directly, and we find, for some $\alpha \neq 0$, that $$ y(t) \sim \alpha t^{c_1} e^{c_0 t}, $$ where $f(t) \sim g(t)$ means $\lim_{t \to \infty} f(t)/g(t) = 1$.
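For completeness, here is the direct computation behind this asymptotic, integrating the separable equation term by term:
$$ y(t) = y_0 \exp\!\left(\int_1^t \left(c_0 + c_1 s^{-1} + \dots + c_n s^{-n}\right) ds\right) = y_0\, e^{c_0 (t-1)}\, t^{c_1} \exp\!\left(\sum_{k=2}^n \frac{c_k}{k-1}\left(1 - t^{1-k}\right)\right), $$
so as $t \to \infty$ the last factor converges and $$ \alpha = y_0\, e^{-c_0} \exp\!\left(\sum_{k=2}^n \frac{c_k}{k-1}\right) > 0. $$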
I'm interested in the generalization of this result to the case of a system of differential equations. Basically, I think something like the following is probably true or almost true:
Prop. 1. Let $A_0$ be a $d \times d$ non-negative irreducible matrix with Perron-Frobenius eigenvalue $\lambda$, and let $A_1, A_2, \dots, A_n$ be non-negative $d \times d$ matrices. Then the system $$ \frac{\text{d} \mathbf{y}}{\text{d} t} = \left(A_0 + A_1 t^{-1} + A_2 t^{-2} + \dots + A_n t^{-n} \right) \mathbf{y} ;\quad \mathbf{y}(1) = \mathbf{y}_0 > \mathbf{0}; \quad t \geq 1 $$ satisfies, for some $\alpha_i \neq 0$ and some $\beta_i$, $$ |\mathbf{y}_i(t)| \sim \alpha_i t^{\beta_i} e^{\lambda t}, \quad 1 \leq i \leq d. $$
Question: I'm looking for a reference showing that Prop. 1 is true, or true under some additional technical condition. It seems similar to Theorem 19.1 of *Asymptotic Expansions for Ordinary Differential Equations* by Wasow, but I don't fully see the connection.
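Not a reference, but Prop. 1 is easy to sanity-check numerically. The example below is my own choice, picked so the exact solution is available: $A_0 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$ (irreducible, $\lambda = 1$), $A_1 = I$, $n = 1$. Since $A_0$ and $A_1$ commute, $\mathbf{y}(t) = t\, e^{A_0(t-1)} \mathbf{y}_0$, so with $\mathbf{y}_0 = (1,1)^\top$ each component equals $t e^{t-1}$, i.e. the predicted form $\alpha t^{\beta} e^{\lambda t}$ with $\alpha = e^{-1}$, $\beta = 1$.

```python
import math

def rhs(t, y):
    # y' = (A0 + A1/t) y with A0 = [[0,1],[1,0]], A1 = I (example choice)
    return [y[1] + y[0] / t, y[0] + y[1] / t]

def rk4(y, t0, t1, h):
    # classical 4th-order Runge-Kutta from t0 to t1 with fixed step h
    t, n = t0, round((t1 - t0) / h)
    for _ in range(n):
        k1 = rhs(t, y)
        k2 = rhs(t + h/2, [y[i] + h/2 * k1[i] for i in range(2)])
        k3 = rhs(t + h/2, [y[i] + h/2 * k2[i] for i in range(2)])
        k4 = rhs(t + h,   [y[i] + h * k3[i] for i in range(2)])
        y = [y[i] + h/6 * (k1[i] + 2*k2[i] + 2*k3[i] + k4[i]) for i in range(2)]
        t += h
    return y

y = rk4([1.0, 1.0], 1.0, 10.0, 1e-3)
exact = 10.0 * math.exp(9.0)       # t e^{t-1} at t = 10
print(y[0] / exact, y[1] / exact)  # both ratios close to 1
```

This only confirms the commuting case, where the claim reduces to the scalar argument; the non-commuting case is exactly what the reference request is about.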

Let's derive the asymptotic result of Prop. 1 directly.
Diagonalize the matrix $A_0$ (assuming it is diagonalizable): write $$A_0 = U^{-1}DU,$$ with $D$ the diagonal matrix whose diagonal entries are the eigenvalues $\lambda_i$, $i=1,\dots,d$, and $U^{-1}$ the matrix whose columns are the corresponding eigenvectors. The system is then equivalent to $$\begin{align} &\phantom{\Longleftrightarrow} \frac{d }{d t}\mathbf{y} = \left(U^{-1}DU + A_1 t^{-1} + A_2 t^{-2} + \dots + A_n t^{-n} \right) \mathbf{y}\\ &\Longleftrightarrow \frac{d }{d t}(U\mathbf{y}) = \left(DU + UA_1 t^{-1} + UA_2 t^{-2} + \dots + UA_n t^{-n} \right) \mathbf{y} \\ &\Longleftrightarrow \frac{d }{d t}(U\mathbf{y}) = \left(D + UA_1U^{-1} t^{-1} + UA_2U^{-1} t^{-2} + \dots + UA_nU^{-1} t^{-n} \right) (U\mathbf{y}) \tag{1}\\ \end{align}$$
Let's denote $\mathbf{z} = U\mathbf{y}$, then
$$(1) \Longleftrightarrow \frac{d }{d t}\mathbf{z} = \left(D + UA_1U^{-1} t^{-1} + UA_2U^{-1} t^{-2} + \dots UA_nU^{-1} t^{-n} \right) \mathbf{z} \tag{2}$$
The $i^{\text{th}}$ equation of $(2)$, for $i=1,\dots,d$, has the form $$\frac{d z_i }{d t}(t) = \lambda_i z_i(t)+ t^{-1}\sum_{j=1}^d \alpha^{(1)}_{ij}z_{j}(t) + \color{red}{t^{-2}}\sum_{j=1}^d \alpha^{(2)}_{ij}z_{j}(t) + \dots + \color{red}{t^{-n}}\sum_{j=1}^d \alpha^{(n)}_{ij}z_{j}(t), \tag{3} $$ where $\alpha^{(k)}_{ij}$ denotes the $(i,j)$ entry of $UA_kU^{-1}$.
For large $t$ ($t\to +\infty$), the terms in red become negligible compared with the $t^{-1}$ term. By the same argument as in the one-dimensional case, we can ignore them, and then, for $i=1,\dots,d$, we have
$$(3) \implies \frac{d z_i }{d t}(t) \sim \lambda_i z_i(t) + \frac{1}{t}\sum_{j=1}^d \alpha^{(1)}_{ij}z_{j}(t) \qquad \forall i=1,\dots,d \tag{4}$$
Let's solve the system of equations $(4)$: $$\begin{align} (4) &\Longleftrightarrow \left(e^{-\lambda_i t} z_i(t) \right)' = \frac{1}{t}\sum_{j=1}^d \alpha^{(1)}_{ij} e^{-\lambda_i t}z_{j}(t) \qquad \forall i=1,\dots,d \end{align}$$
For the sake of simplicity, let's denote $p_i(t) = e^{-\lambda_i t} z_i(t)$ and $ \mathbf{p} =\left(p_1(t),\dots,p_d(t) \right)$. The matrix form of the relation between $p_i(t)$ and $z_i(t)$ is $$\mathbf{p} = \Gamma^{-1}\mathbf{z},$$ with $\Gamma$ the diagonal matrix with entries $e^{\lambda_i t}$, $i=1,\dots,d$.
The matrix form of the new system is then $$\frac{d}{dt} \mathbf{p} = \frac{1}{t} UA_1U^{-1} \mathbf{p} \tag{5}$$ Strictly speaking, since $e^{-\lambda_i t}z_j(t) = e^{(\lambda_j - \lambda_i)t}p_j(t)$, the exact matrix form is $\frac{d}{dt}\mathbf{p} = \frac{1}{t}\,\Gamma^{-1}\left(UA_1U^{-1}\right)\Gamma\, \mathbf{p}$; writing $(5)$ drops the factors $e^{(\lambda_j - \lambda_i)t}$ and is exact only when all the $\lambda_i$ coincide, so what follows is heuristic.
Diagonalize the matrix $A_1$ (again assumed diagonalizable), with $H$ the diagonal matrix with diagonal entries $\beta_1,\dots,\beta_d$ and $V^{-1}$ the matrix of eigenvectors:
$$A_1 = V^{-1}HV$$
then $$\begin{align} (5) &\Longleftrightarrow \frac{d}{dt}\mathbf{p} = \frac{1}{t} UV^{-1}HVU^{-1} \mathbf{p} \\ &\Longleftrightarrow \frac{d}{dt}(VU^{-1}\mathbf{p} )= \frac{1}{t} H (VU^{-1} \mathbf{p}) \tag{6}\\ \end{align}$$
$(6)$ can be solved easily. Indeed, writing $\mathbf{q} = VU^{-1}\mathbf{p}$, each equation of $(6)$ has the form $$\frac{d}{dt}q_i(t) = \frac{\beta_i}{t}q_i(t) \Longleftrightarrow q_i(t) = c_it^{\beta_i} \qquad \forall i=1,\dots,d $$
Then $$\begin{align} (6)&\Longleftrightarrow VU^{-1}\mathbf{p} = \begin{pmatrix} c_1t^{\beta_1} \\ c_2t^{\beta_2} \\ \vdots\\ c_dt^{\beta_d} \\ \end{pmatrix} \\ &\Longleftrightarrow VU^{-1}\Gamma^{-1}\mathbf{z} = \begin{pmatrix} c_1t^{\beta_1} \\ c_2t^{\beta_2} \\ \vdots\\ c_dt^{\beta_d} \\ \end{pmatrix} \\ &\Longleftrightarrow \Gamma^{-1}VU^{-1}\mathbf{z} = \begin{pmatrix} c_1t^{\beta_1} \\ c_2t^{\beta_2} \\ \vdots\\ c_dt^{\beta_d} \\ \end{pmatrix} \qquad \text{(heuristically commuting } \Gamma^{-1} \text{ past } VU^{-1})\\ &\Longleftrightarrow VU^{-1}U\mathbf{y} = \Gamma\begin{pmatrix} c_1t^{\beta_1} \\ c_2t^{\beta_2} \\ \vdots\\ c_dt^{\beta_d} \\ \end{pmatrix} \\ &\Longleftrightarrow \color{red}{\mathbf{y} = V^{-1} \begin{pmatrix} c_1t^{\beta_1} e^{\lambda_1 t}\\ c_2t^{\beta_2}e^{\lambda_2 t} \\ \vdots\\ c_dt^{\beta_d}e^{\lambda_d t} \\ \end{pmatrix}} \tag{7} \\ \end{align}$$ with $V$ as in $A_1 = V^{-1}HV$. Note that the third step is another heuristic: $\Gamma$ is diagonal but not scalar, so it does not commute with $VU^{-1}$ in general (it does when all the $\lambda_i$ coincide); the swap does not change the form $c\, t^{\beta} e^{\lambda t}$ of the terms appearing in each component.
$(7)$ is a more precise description than the one in your Prop. 1. If we keep only the leading term of each $y_i(t)$, we can proceed as follows:
$$\color{red}{y_i(t) = \sum_{j=1}^d v_{ij}c_jt^{\beta_j}e^{\lambda_j t}} \sim v_{i\color{blue}{g}(i)}c_{\color{blue}{g}(i)} t^{\beta_{\color{blue}{g}(i)}}e^{\lambda_{\color{blue}{g}(i)} t} \qquad \forall i=1,\dots,d $$
with $v_{ij}$ the $(i,j)$ entry of $V^{-1}$ and $\color{blue}{g}(i) =\underset{j=1,\dots,d}{\operatorname{argmax}} \{\lambda_j \mid v_{ij}c_j \ne 0\}$.
Remark: if none of the coefficients $v_{ij}, c_j$ for $1\le i,j \le d$ vanish, then for all $i=1,\dots,d$ we have $\color{blue}{g}(i) = \color{blue}{g} =\underset{j=1,\dots,d}{\operatorname{argmax}} \{\lambda_j \}$, the index $j$ such that $\lambda_j=\color{purple}{\lambda}$ (the Perron-Frobenius eigenvalue), and so $$y_i(t) \sim v_{i\color{blue}{g}}c_\color{blue}{g} t^{\beta_\color{blue}{g}}e^{\color{purple}{\lambda} t} \qquad \forall i=1,\dots,d $$
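The final asymptotic can also be checked numerically on a hand-picked example where everything is explicit (a commuting case again, so this illustrates the formula rather than validating the heuristic steps): take $A_0 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$ and $A_1 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$, so that from $\mathbf{y}_0 = (1,1)^\top$ the exact solution is $y_i(t) = t^2 e^{t-1}$, i.e. $\lambda = 1$ and $\beta = 2$ (the eigenvalue of $A_1$ along the Perron direction). Estimating $\beta$ from the slope of $\log\!\left(y_1(t)e^{-\lambda t}\right)$ against $\log t$:

```python
import math

A0 = [[0.0, 1.0], [1.0, 0.0]]  # irreducible, Perron-Frobenius eigenvalue lambda = 1
A1 = [[1.0, 1.0], [1.0, 1.0]]  # eigenvalues 2 and 0; (1,1) is a common eigenvector

def rhs(t, y):
    # y' = (A0 + A1/t) y
    return [sum((A0[i][j] + A1[i][j] / t) * y[j] for j in range(2)) for i in range(2)]

def rk4(y, t0, t1, h=1e-3):
    # classical 4th-order Runge-Kutta with fixed step h
    t, n = t0, round((t1 - t0) / h)
    for _ in range(n):
        k1 = rhs(t, y)
        k2 = rhs(t + h/2, [y[i] + h/2 * k1[i] for i in range(2)])
        k3 = rhs(t + h/2, [y[i] + h/2 * k2[i] for i in range(2)])
        k4 = rhs(t + h,   [y[i] + h * k3[i] for i in range(2)])
        y = [y[i] + h/6 * (k1[i] + 2*k2[i] + 2*k3[i] + k4[i]) for i in range(2)]
        t += h
    return y

y20 = rk4([1.0, 1.0], 1.0, 20.0)
y40 = rk4([1.0, 1.0], 1.0, 40.0)
lam = 1.0
# slope of log(y_1(t) e^{-lambda t}) against log t estimates beta
beta = ((math.log(y40[0]) - lam * 40) - (math.log(y20[0]) - lam * 20)) \
       / (math.log(40) - math.log(20))
print(beta)  # close to 2
```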