First formulation of my problem:
Let $A_1,\cdots,A_j$ be Hermitian positive definite matrices of dimension $n$ with all their eigenvalues in $(0;1]$. We also add the condition that $\|A_1\cdots A_j\|_2^2<1$. I am wondering whether the quantity
$$\frac{-1}{k}\log\left(\rho\left(A_1^k\cdots A_j^k\right)\right)$$
has a (positive and finite) limit when $k$ goes to infinity. Here $\rho(M)$ denotes the spectral radius of the matrix $M$.
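A small numerical sketch (my own illustration, not part of the question) of the quantity in question: for randomly generated Hermitian positive definite matrices with spectrum inside $(0;1)$, the values of $-\frac1k\log\rho(A_1^k\cdots A_j^k)$ appear to stabilize as $k$ grows. The dimension, number of factors, and eigenvalue range are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, j = 3, 2  # dimension and number of factors (arbitrary choices)

def random_factor(n):
    # random Hermitian positive definite matrix with eigenvalues in (0.5, 0.99)
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, _ = np.linalg.qr(g)
    return q @ np.diag(rng.uniform(0.5, 0.99, size=n)) @ q.conj().T

mats = [random_factor(n) for _ in range(j)]

def quantity(k):
    # -1/k * log(rho(A_1^k ... A_j^k))
    prod = np.eye(n, dtype=complex)
    for a in mats:
        prod = prod @ np.linalg.matrix_power(a, k)
    rho = np.abs(np.linalg.eigvals(prod)).max()
    return -np.log(rho) / k

for k in (10, 50, 200):
    print(k, quantity(k))
```

Since every eigenvalue is strictly below $1$, the hypothesis $\|A_1\cdots A_j\|_2^2<1$ holds and the printed values stay positive.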
Second formulation:
The same problem can be reformulated in terms of ordinary differential equations: let $a : [0;1]\to \mathscr H_n^{+}$ be a bounded, piecewise continuous function from $[0;1]$ to the space of Hermitian positive semidefinite matrices, and define $G_a :[0;1]\to \mathscr{M}_n(\mathbf C)$ as the solution of the following differential equation
$$ \left\{ \begin{array}{rcl} G_a(0)&=&\mathrm{Id}\\ G_a'(t)&=&-a(t)G_a(t). \end{array} \right. $$ We moreover assume that $\|G_a(1)\|_2^2<1$. I want to know whether the quantity $$\frac{-1}{k}\log\big(\rho\left(G_{ka}(1)\right)\big)$$ has a finite positive limit when $k\to \infty$. This is indeed a reformulation of the first problem if we take $a$ to be constant on each interval $[\frac{i-1}{j};\frac{i}{j})$, equal there to $-j\log A_i$ (so that $\exp(-a/j)=A_i$ on that interval). By a density argument the two problems are equivalent.
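To illustrate the link between the two formulations (my own sketch, with arbitrary sizes and step counts): for piecewise constant $a$, the solution of $G'=-a(t)G$ at $t=1$ is a product of matrix exponentials $\exp(-a_i/j)$, which the code below compares against a crude explicit Euler integration of the ODE.

```python
import numpy as np

rng = np.random.default_rng(1)
n, j = 3, 2

def random_hpsd(n):
    # Hermitian positive semidefinite matrix, scaled down to keep norms modest
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return 0.3 * (g @ g.conj().T)

def expm_hermitian(h):
    # matrix exponential via the eigendecomposition of a Hermitian matrix
    lam, u = np.linalg.eigh(h)
    return (u * np.exp(lam)) @ u.conj().T

pieces = [random_hpsd(n) for _ in range(j)]  # a(t) = pieces[i] on [i/j, (i+1)/j)

# closed form: G(1) = exp(-a_{j-1}/j) ... exp(-a_0/j)  (later pieces act last)
closed = np.eye(n, dtype=complex)
for a in pieces:
    closed = expm_hermitian(-a / j) @ closed

# explicit Euler integration of G' = -a(t) G on a uniform grid
steps = 20_000
dt = 1.0 / steps
g = np.eye(n, dtype=complex)
for s in range(steps):
    a = pieces[min(int(s * dt * j), j - 1)]
    g = g - dt * (a @ g)

err = np.max(np.abs(g - closed))
print(err)  # small discretization error
```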
What I already know: Note that in dimension $1$ (i.e. the matrices are just real numbers) the answer to my question is trivially yes. The few numerical simulations I have done (and my intuition) seem to indicate that the answer is yes: there is a positive finite limit. Finally, I was able to prove an upper and a lower bound for the first formulation of the problem; these bounds are not sharp and I will only sketch their proofs:
First bound: all the $A_i^k$ are contractions of $\mathbf C^n$, so their product $A_1^k\cdots A_j^k$ is a contraction and every one of its eigenvalues has modulus at most $1$. Since the modulus of the determinant is the product of the moduli of the eigenvalues, $\det\left(A_1^k\cdots A_j^k\right)\leq\rho\left(A_1^k\cdots A_j^k\right)$, and since $\det\left(A_1^k\cdots A_j^k\right)=\det\left(A_1\cdots A_j\right)^k$, we get $\rho\left(A_1^k\cdots A_j^k\right)\geq \det\left(A_1\cdots A_j\right)^{k}$, giving the first bound.
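A quick numerical sanity check of this first bound (my own sketch, with two random factors): $\rho(A_1^k A_2^k)\geq\det(A_1 A_2)^k$ when each factor is Hermitian with spectrum in $(0;1]$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

def random_factor(n):
    # random Hermitian positive definite matrix with eigenvalues in (0.3, 1.0)
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, _ = np.linalg.qr(g)
    return q @ np.diag(rng.uniform(0.3, 1.0, size=n)) @ q.conj().T

a1, a2 = random_factor(n), random_factor(n)
det = np.linalg.det(a1 @ a2).real  # positive, since both factors are HPD

for k in (1, 5, 20):
    pk = np.linalg.matrix_power(a1, k) @ np.linalg.matrix_power(a2, k)
    rho = np.abs(np.linalg.eigvals(pk)).max()
    assert rho >= det**k * (1 - 1e-9)  # the bound holds up to rounding
```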
Second bound: one can show that every coefficient of $A_1^k$ is of the form $C(\alpha+o(1))^k$ for some $C$ and $\alpha$. Products and sums of two such expressions can still be expressed in that form. From this we deduce that the coefficients of the characteristic polynomial of $A_1^k\cdots A_j^k$ are also of the form $C(\alpha+o(1))^k$, so it is an exponentially small perturbation of the polynomial $X^n$. Since the roots of a polynomial of degree $n$ are $1/n$-Hölder functions of its coefficients, we get the other bound on $\rho\left(A_1^k\cdots A_j^k\right)$.
This second bound suggests considering the matrices $A_1^k\cdots A_j^k$ as small perturbations of the null matrix; maybe this approach is fruitful, but I don't know much about perturbation theory.
Any $n\times n$ Hermitian matrix $B$ has $n$ real eigenvalues $\lambda_j$ and $n$ eigenvectors $u_j$ such that
$$\lambda_j u_j=Bu_j.$$
We can choose the eigenvectors such that $\Vert u_j\Vert_2=1$ and, in fact, such that they form an orthonormal basis. We define the matrix
$$X:=\begin{pmatrix} u_1 \\ \vdots \\ u_n \end{pmatrix}$$
with the row vectors $u_j$; since the $u_j$ are orthonormal, $X$ is a unitary complex matrix. Then there exists a decomposition
$$B=\overline X^t \Lambda X$$
with the diagonal matrix $\Lambda=(\lambda_j)$ and the conjugate transpose $\overline X^t$ of the matrix $X$. Moreover, this conjugate transpose is the inverse of the matrix $X$, i.e.
$$E=\overline X^t X =X\overline X^t$$
with the identity matrix $E$. Now we easily obtain
$$B^2={(\overline X^t \Lambda X)\cdot(\overline X^t \Lambda X)}={\overline X^t \Lambda X\overline X^t \Lambda X}={\overline X^t \Lambda\Lambda X}={\overline X^t \Lambda^2 X}$$
and in general
$$B^k={\overline X^t \Lambda^k X}.$$
But we immediately obtain $\Lambda^k=(\lambda_j^k)$.
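A short numpy check of the power identity $B^k=\overline X^t\Lambda^k X$ derived above (my own sketch; note that `np.linalg.eigh` returns the orthonormal eigenvectors as the *columns* of `u`, so `u` corresponds to $\overline X^t$ here):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 4, 7
g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
b = g @ g.conj().T                     # a Hermitian matrix B
lam, u = np.linalg.eigh(b)             # B = u diag(lam) u^*
bk = u @ np.diag(lam**k) @ u.conj().T  # the claimed form of B^k
direct = np.linalg.matrix_power(b, k)  # B^k computed directly
assert np.allclose(bk, direct)
```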
With these notations we write
$$A_i=\overline X_i^t \Lambda_i X_i$$
with the eigenvalues $\lambda_{ij}$ and the eigenvectors $u_{ij}$. We immediately get
$$\tag{1}R_k:=A_1^k A_2^k \cdots A_i^k={(\overline X_1^t \Lambda_1^k X_1)\cdot (\overline X_2^t \Lambda_2^k X_2)\cdots(\overline X_i^t \Lambda_i^k X_i)}.$$
In this product we replace the eigenvalues $\lambda_{ij}^k$ with the variables $\xi_{ij}=\lambda_{ij}^k$ and obtain $\Lambda_i^k=(\xi_{ij}).$
Now we simply multiply out the product (1), and the resulting matrix $R_k$ can be written as
$$R_k=(r_{\mu\nu})=\big(r_{\mu\nu}\big((\xi_{ij})_{i\times n}\big)\big)=(r_{\mu\nu}(\xi_{11},\dots,\xi_{1n},\dots,\xi_{i1},\dots,\xi_{in}))$$
so that every coefficient $r_{\mu\nu}$ is a polynomial in the $(\xi_{ij})_{i\times n}$. Among the monomials $\xi_{1s_1}\xi_{2s_2}\cdots\xi_{is_i}=(\lambda_{1s_1}\lambda_{2s_2}\cdots\lambda_{is_i})^k$ occurring with nonzero coefficient, we can choose indices $1\le s_\nu\le n$ such that $\lambda_{max}:=\vert\lambda_{1s_1}\lambda_{2s_2}\cdots\lambda_{is_i}\vert$ is maximal; then at least one coefficient
$$\dfrac{r_{\mu\nu}}{\lambda_{max}^k}$$
is nonzero. (Otherwise the matrix $R_k$ is $0$ and nothing is left to do.) We finally define
$$\hat R_k:=\dfrac{1}{\lambda_{max}^k}R_k.$$
As $k\to\infty$, $\hat R_k$ converges to a matrix $\hat R_\infty$ with complex coefficients: every monomial of the matrix $R_k$ with value $\vert\lambda_{1s_1}\lambda_{2s_2}\cdots\lambda_{is_i}\vert\lt\lambda_{max}$ vanishes in the limit, and only the monomials with absolute value $\lambda_{max}$ remain. Dividing these monomials by $\lambda_{max}^k$ yields a complex value for each coefficient $\hat r_{\mu\nu}$. It is well known from functional analysis that the spectrum $\mathrm{Spec}\{\hat R_k\}$ of the matrices $\hat R_k$ converges to the spectrum $\mathrm{Spec}\{\hat R_\infty\}$ (see for example MathOverflow: Convergence of eigenvectors). The maximal absolute eigenvalue $s_{max}$ of this limit spectrum is finite, and the spectral radius is $\rho(\hat R_k)=\lim_{m\to\infty}\sqrt[m]{\left\Vert \hat R_k^m\right\Vert}$. In the case $s_{max}>0$ we obtain
$$\tag{2}\lim_{k\to\infty}\sqrt[k]{\rho(R_k)}=\lim_{k\to\infty}\sqrt[k]{\lambda_{max}^{k}\,\rho(\hat R_k)}=\lambda_{max}\lim_{k\to\infty}\sqrt[k]{\rho(\hat R_k)}=\lambda_{max}$$
(using $\rho(R_k)=\lambda_{max}^k\,\rho(\hat R_k)$ and $\rho(\hat R_k)\to s_{max}>0$, so that $\sqrt[k]{\rho(\hat R_k)}\to1$),
so that the limit exists, and it is a product $\vert\lambda_{1s_1}\lambda_{2s_2}\cdots\lambda_{is_i}\vert$ for a certain choice of one eigenvalue $\lambda_{is_i}$ of each matrix $A_i$.
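A numerical illustration of conclusion (2) (my own sketch, assuming the generic case $s_{max}>0$): for two random Hermitian contractions, the $k$-th root of $\rho(A_1^k A_2^k)$ should approach a product of one eigenvalue of each factor, so we compare it against all such candidate products.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(4)
n = 3

def random_factor(n):
    # random Hermitian positive definite matrix with eigenvalues in (0.7, 0.99)
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, _ = np.linalg.qr(g)
    return q @ np.diag(rng.uniform(0.7, 0.99, size=n)) @ q.conj().T

a1, a2 = random_factor(n), random_factor(n)
# all products of one eigenvalue of a1 with one eigenvalue of a2
candidates = [x * y for x, y in product(np.linalg.eigvalsh(a1),
                                        np.linalg.eigvalsh(a2))]

k = 800
rho = np.abs(np.linalg.eigvals(
    np.linalg.matrix_power(a1, k) @ np.linalg.matrix_power(a2, k))).max()
root = rho ** (1.0 / k)
gap = min(abs(root - c) for c in candidates)
print(root, gap)  # the gap should be small for large k
```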
In the case that $s_{max}=0$ we have to dig a little deeper. In this case we determine the characteristic polynomial
$$p_k(\lambda):=\lambda^n+a_{n-1}\lambda^{n-1}+\dots+a_0:=\det(\lambda E-\hat R_k)$$
of the matrix $\hat R_k$. If all coefficients $a_j$ are zero we take $s_{max}=0$. Otherwise the coefficients $a_j$ converge to $0$ as $k$ grows, and there exists an $\alpha\gt0$ such that the polynomials
$$\hat p_k(\lambda)=\alpha^{nk}p_k\left(\dfrac{\lambda}{\alpha^k}\right)=\lambda^n+a_{n-1}\alpha^{k}\lambda^{n-1}+\dots+\alpha^{nk}a_0$$
converge to a polynomial $\hat p_\infty(\lambda)$ with bounded coefficients, at least one of which is nonzero. But then the zeros of $\hat p_k(\lambda)$ converge to those of $\hat p_\infty$, with a maximal zero $\lambda_0\ne0$, and we take $s_{max}=\vert\lambda_0\vert\alpha^{-k}$ in the limit (2).