Expectation of exponents of Random Matrices


TLDR: Given a random matrix A, how do I find a closed-form expression for a deterministic matrix B such that $\mathbb{E}[e^A]=e^B$, in terms of the elements of A?

Background: This has to do with solutions of switching linear systems of the form $\dot{x}=Ax+Du$. The random matrix A doesn't have any special properties as far as I can tell, other than that it has a couple of zero eigenvalues (it is 9x9 for my use case). The D matrix is deterministic. Some of the entries of A are random scalars: they switch between two values randomly. The remaining entries of A are deterministic.

I am interested in finding what the 'averaged' system looks like. So I need to find the deterministic system matrix B so that the system $\dot{y}=By+Du$ is the average of the random system: $\mathbb{E}[x]=y$. Here $x$ and $y$ are vectors.

I can find it numerically using Monte-Carlo-type simulations (take a large number of realizations of the matrix A, compute $e^A$ for each, average them, and take the matrix logarithm to get B), but this won't give me an exact representation. For example, if the random elements of the A matrix are a, b, c, I want to find an expression for B in terms of (a, b, c).
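The Monte Carlo procedure described above can be sketched as follows. This is a hypothetical 2x2 stand-in for the 9x9 system, with a single Bernoulli random entry `a`; the function names are placeholders:

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(0)

# Hypothetical 2x2 stand-in: one entry switches between 0 and 1 with equal probability.
def sample_A(rng):
    a = rng.integers(0, 2)  # Bernoulli(0.5), values in {0, 1}
    return np.array([[0.0, 1.0],
                     [-1.0, -float(a)]])

def monte_carlo_B(sample_A, n_samples=2000):
    """Estimate B = logm(E[e^A]), replacing the expectation by a sample mean."""
    mean_exp = sum(expm(sample_A(rng)) for _ in range(n_samples)) / n_samples
    return logm(mean_exp)

B = monte_carlo_B(sample_A)
```

As the question notes, this only gives a numerical estimate of B for one fixed set of switching probabilities, not a symbolic expression in (a, b, c).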

Any thoughts or suggestions?

(Also, I'm not really a math student but a dumb engineer, so I have a feeling this is probably child's play to some math majors)

PS: I've added an example of the 9x9 A matrix to give a better idea. Here a, b, c are Bernoulli random variables: each takes a value in {0,1} with known probability, and they are independent of each other. But since some of these variables occur more than once, I cannot claim that all elements of the matrix A are independent of each other.

$A=\begin{bmatrix} 0&1&0&&&&&&\\ 0&0&1&&&&&&\\ 0&0&-1&&&&&&\\ &&&0&1&0&&&\\ &&&0&0&1&&&\\ 1&1&a&-1&-2&-1&&&\\ &&&&&&0&1&0\\ &&&&&&0&0&1\\ b&b&b&1&1&c&(-b-1)&(-2-3b)&-1 \end{bmatrix}$
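Since a, b, c are independent and each takes only the two values {0, 1}, the expectation $\mathbb{E}[e^A]$ is an exact finite sum over the 8 joint outcomes, weighted by their probabilities, and B follows by a matrix logarithm. A sketch (the function names and the probabilities `p_a`, `p_b`, `p_c` are placeholders for the known switching probabilities):

```python
import itertools
import numpy as np
from scipy.linalg import expm, logm

def build_A(a, b, c):
    """Build the 9x9 A from the question for one realization of (a, b, c)."""
    A = np.zeros((9, 9))
    A[0, 1] = 1
    A[1, 2] = 1
    A[2, 2] = -1
    A[3, 4] = 1
    A[4, 5] = 1
    A[5, :6] = [1, 1, a, -1, -2, -1]
    A[6, 7] = 1
    A[7, 8] = 1
    A[8, :] = [b, b, b, 1, 1, c, -b - 1, -2 - 3 * b, -1]
    return A

def exact_B(p_a, p_b, p_c):
    """B = logm(E[e^A]); the expectation is a sum over the 8 outcomes of (a, b, c)."""
    E = np.zeros((9, 9))
    for a, b, c in itertools.product([0, 1], repeat=3):
        w = ((p_a if a else 1 - p_a)
             * (p_b if b else 1 - p_b)
             * (p_c if c else 1 - p_c))
        E += w * expm(build_A(a, b, c))
    return logm(E)  # principal matrix logarithm
```

This is exact up to floating-point error, but it is still numerical: a closed-form symbolic B would require carrying the eight-term sum through `logm` symbolically.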

Answer:
I assume that $A$ and $Du$ are time independent. Then the differential equation

$$ \dot{x}(t) = A x(t) + Du $$

has the following solution (found with the method of "variation of constants")

$$ x(t) = U(t) x(0) + \int_0^t ds U(t-s) Du $$

where

$$ U(t) = e^{tA} $$

satisfies the homogeneous ODE:

$$\dot{U}(t) = A U(t).$$

Note that $U(t)$ satisfies the semi-group property $U(t+s)=U(t)U(s)$. Now, $A$ is a random matrix and it seems you are interested in the expectation value of $x(t)$, i.e. $y(t):=\mathsf{E}[x(t)]$.

Defining $$\mathcal{E}(t):=\mathsf{E}[U(t)],$$ we have

$$ y(t) = \mathcal{E}(t) x(0) + \int_0^t ds \mathcal{E}(t-s) Du \tag{1} $$
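Eq. (1) can be evaluated numerically once $\mathcal{E}(t)$ is known. A minimal sketch for a hypothetical 2x2 system whose A takes each of two values with probability 1/2 (all matrices and names here are illustrative, not from the question):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical toy system: A switches between A0 and A1, each with probability 1/2.
A0 = np.array([[0.0, 1.0], [-1.0, 0.0]])
A1 = np.array([[0.0, 1.0], [-1.0, -1.0]])
Du = np.array([0.0, 1.0])
x0 = np.array([1.0, 0.0])

def E(t):
    """Averaged propagator E(t) = E[e^{tA}] = 0.5 e^{t A0} + 0.5 e^{t A1}."""
    return 0.5 * expm(t * A0) + 0.5 * expm(t * A1)

def y(t, n=401):
    """Eq. (1): y(t) = E(t) x(0) + integral_0^t E(t-s) Du ds (trapezoidal rule)."""
    s = np.linspace(0.0, t, n)
    vals = np.array([E(t - si) @ Du for si in s])
    h = t / (n - 1)
    integral = h * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])
    return E(t) @ x0 + integral

yt = y(1.0)
```

By linearity of the expectation, this matches the average of the per-realization solutions of the ODE.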

Next, it seems that you are interested in finding an ODE for $y(t)$. I don't think this is possible in general. Indeed, we would proceed in the following way:

Define $$ B(t) : = \dot{\mathcal{E}}(t) \mathcal{E}^{-1}(t) $$

so $\mathcal{E}(t)$ satisfies

$$ \dot{\mathcal{E}}(t) = B(t) \mathcal{E}(t) $$

and we would hope that $y(t)$ satisfies:

$$ \dot{y}(t) = B(t) y(t) + Du \tag{2} $$

Unfortunately, the solution of (2) is

$$ y(t) = \mathcal{E}(t) x(0) + \int_0^t ds \mathcal{E}(t) \mathcal{E}^{-1}(s) Du $$

But this is different from Eq. (1), because $\mathcal{E}(t) \mathcal{E}^{-1}(s)\neq \mathcal{E}(t-s)$ in general: $\mathcal{E}(t)$ does not satisfy the semigroup property unless $B(t)$ is time-independent.
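The failure of the semigroup property for the averaged propagator is easy to check numerically. A sketch on a hypothetical 2x2 example (the matrices are illustrative, not from the question):

```python
import numpy as np
from scipy.linalg import expm, inv

# Hypothetical example: A takes each of two values with probability 1/2.
A0 = np.array([[0.0, 1.0], [-1.0, 0.0]])
A1 = np.array([[0.0, 1.0], [-1.0, -1.0]])

def E(t):
    """Averaged propagator E(t) = E[e^{tA}]."""
    return 0.5 * expm(t * A0) + 0.5 * expm(t * A1)

t, s = 1.0, 0.4
# For a fixed A, e^{(t-s)A} = e^{tA} e^{-sA}; after averaging this identity breaks.
gap = np.max(np.abs(E(t - s) - E(t) @ inv(E(s))))
print(gap)  # strictly positive: the averaged propagator is not a semigroup
```

Each individual realization $e^{tA}$ does satisfy the semigroup property; it is the averaging that destroys it, which is exactly why no single time-independent B reproduces $y(t)$ for all t.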

Things would be fine if $Du=0$, or if $B(t)=B$ were constant (though the latter is unlikely).