Having a few problems with proofs regarding Matrix Multiplication and Markov Chains


I'm sorry if my formatting is sloppy, this is my first time posting on this forum, and I'm not certain how to use the site's formatting.

I need to prove four things, and I'm running into a few road blocks. Specifically, I need to prove the following:

  1. PROVE: If $v$ is a probability vector (all entries add up to $1$) and $A$ is a stochastic matrix (a square matrix whose columns are probability vectors), then $Av$ is also a probability vector.

    • For this one, I was given the hint of using $S = [1, 1, 1, \ldots]$ (a row vector), and to remember that $Sv = v_1 + v_2 + \cdots$
  2. SHOW: $v$ is a probability vector if and only if $Sv=1$ and $v_i \geq 0$ for all $i$.

    • I also need to show that A is stochastic if and only if $SA=S$ and all entries $≥0$.
    • The hint given for this one is that $B[a_1, a_2, ..., a_n]$ = $[Ba_1, Ba_2, ..., Ba_n]$
  3. Let $A$ be stochastic. If $(\lambda,v)$ is an eigenpair, then either $\lambda=1$, or the sum of the entries of $v$ is $0$.

    • The hint here is to use $S=[1, 1, ...]$.
  4. Let A be a stochastic $2 \times 2$ matrix. Show that $1$ is an eigenvalue of A, and the other eigenvalue satisfies $|λ|≤1$.

    • The thing to remember is that, as a stochastic matrix, $A = \begin{pmatrix} a & b \\ 1-a & 1-b \end{pmatrix}$.

Any help for any or all of these would be greatly appreciated. Thank you in advance.


Allow me to use lowercase $\mathbf{s}$ for the row vector $(1,\ldots,1)$. By $\sum_i$, I mean $\sum_{i=1}^{n}$, where $n$ is the dimension of your objects.

  2. To prove an equivalence, you need to prove two implications. The assumption that all entries are $\geq 0$ rules out vectors like $(2,-1)$, whose entries also sum to $1$ because the positive and negative parts cancel; other than that, it is not needed in the proof.

$\implies$: Assume $\mathbf{v}$ is a probability vector, i.e. $\sum_i \mathbf{v}_i=1$. Multiplying the two vectors gives $\mathbf{sv}=\sum_i \mathbf{s}_i\mathbf{v}_i$. Since $\mathbf{s}_i=1$ for all $i$, this simplifies to $\sum_i \mathbf{s}_i\mathbf{v}_i=\sum_i \mathbf{v}_i$, which is $1$ by assumption.

$\impliedby$: Pretty much read the whole thing backwards. Assume $\mathbf{sv}=1$, then $1=\mathbf{sv}=\sum_i \mathbf{s}_i\mathbf{v}_i=\sum_i \mathbf{v}_i$.

As for the matrix case, apply the just-proven statement to every column individually: by the hint, $\mathbf{s}A=[\mathbf{s}\mathbf{a}_1, \mathbf{s}\mathbf{a}_2, \ldots, \mathbf{s}\mathbf{a}_n]$, and since each column $\mathbf{a}_j$ is a probability vector, each entry $\mathbf{s}\mathbf{a}_j$ equals $1$, i.e. $\mathbf{s}A=\mathbf{s}$.
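You can also sanity-check the claim numerically; here is a quick NumPy sketch with an arbitrarily chosen $3\times 3$ column-stochastic matrix (an assumption for illustration, not part of the proof):

```python
import numpy as np

# Columns of A each sum to 1 (column-stochastic); entries of v sum to 1.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])
v = np.array([0.25, 0.25, 0.5])

w = A @ v
# Each entry of w is nonnegative, and the entries sum to 1 again,
# so w is itself a probability vector.
print(w, w.sum())
```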

  3. For an eigenpair $(\lambda,\mathbf{v})$, the following holds: $$\lambda\mathbf{v}=A\mathbf{v}$$ Let's multiply the equation by $\mathbf{s}$ from the left and exploit what we proved in 2): $$\lambda(\mathbf{s}\mathbf{v})=\mathbf{s}(\lambda\mathbf{v})=\mathbf{s}A\mathbf{v}\overset{2)}{=}\mathbf{s}\mathbf{v}$$ Hence $(\lambda-1)\,\mathbf{s}\mathbf{v}=0$, so either $\lambda=1$, or $\mathbf{s}\mathbf{v}=\sum_i\mathbf{v}_i=0$. (Note that an eigenvector $\mathbf{v}$ need not be a probability vector, so the second case can genuinely occur.)
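Purely as a numerical illustration of this dichotomy (NumPy again, with the same kind of arbitrarily chosen stochastic matrix), every eigenpair falls into one of the two cases:

```python
import numpy as np

A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

eigvals, eigvecs = np.linalg.eig(A)
for lam, v in zip(eigvals, eigvecs.T):
    # Either the eigenvalue is 1, or the eigenvector's entries sum to 0.
    print(lam, v.sum())
    assert np.isclose(lam, 1.0) or np.isclose(v.sum(), 0.0)
```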

  4. The eigenvalues are the roots of the characteristic polynomial $\det(A-\lambda I)$, where $$A-\lambda I=\begin{pmatrix} a-\lambda & b \\ 1-a & 1-b-\lambda \\ \end{pmatrix}$$ that is $$0\overset{!}{=}(a-\lambda)(1-b-\lambda)-b(1-a)=\lambda^2+(b-a-1)\lambda+(a-b)$$ Notice that $1$ is a root. Since $$\bigl(\lambda^2+(b-a-1)\lambda+(a-b)\bigr):\bigl(\lambda-1\bigr)=\lambda + b - a$$ the other eigenvalue is $a-b$, and since $a,b\in[0,1]$, we get $a-b\in[-1,1]$, i.e. $|\lambda|\leq 1$.
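If it helps, here's a quick numerical check of that factorization, with $a$ and $b$ chosen arbitrarily in $[0,1]$ (an assumption for the example):

```python
import numpy as np

a, b = 0.7, 0.4
A = np.array([[a,     b    ],
              [1 - a, 1 - b]])

eigvals = np.sort(np.linalg.eigvals(A))
# Expect the two eigenvalues a - b = 0.3 and 1.
print(eigvals)
```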